Nov 27 11:09:17 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 27 11:09:17 crc restorecon[4681]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:17 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 27 11:09:18 crc restorecon[4681]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc 
restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc 
restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 
11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 
crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 
11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 11:09:18 crc 
restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc 
restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 27 11:09:18 crc restorecon[4681]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc 
restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc 
restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 27 11:09:18 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 11:09:18 crc restorecon[4681]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 27 11:09:18 crc restorecon[4681]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Nov 27 11:09:19 crc kubenswrapper[4807]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 27 11:09:19 crc kubenswrapper[4807]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Nov 27 11:09:19 crc kubenswrapper[4807]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 27 11:09:19 crc kubenswrapper[4807]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 27 11:09:19 crc kubenswrapper[4807]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Nov 27 11:09:19 crc kubenswrapper[4807]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.312765    4807 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.320965    4807 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321004    4807 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321011    4807 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321017    4807 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321023    4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321029    4807 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321035    4807 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321041    4807 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321046    4807 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321059    4807 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321065    4807 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321072    4807 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321079    4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321086    4807 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321092    4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321099    4807 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321106    4807 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321113    4807 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321120    4807 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321126    4807 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321131    4807 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321138    4807 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321145    4807 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321151    4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321157    4807 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321162    4807 feature_gate.go:330] unrecognized feature gate: Example
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321168    4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321174    4807 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321179    4807 feature_gate.go:330] unrecognized feature gate: NewOLM
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321185    4807 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321191    4807 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321198    4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321204    4807 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321210    4807 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321216    4807 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321223    4807 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321229    4807 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321235    4807 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321242    4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321273    4807 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321279    4807 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321286    4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321295    4807 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321301    4807 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321309    4807 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321318    4807 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321324    4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321330    4807 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321336    4807 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321342    4807 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321348    4807 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321354    4807 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321360    4807 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321367    4807 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321372    4807 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321377    4807 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321384    4807 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321391    4807 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321397    4807 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321402    4807 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321407    4807 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321412    4807 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321419    4807 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321426 4807 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321432 4807 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321438 4807 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321443 4807 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321450 4807 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321455 4807 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321461 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.321467 4807 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321586 4807 flags.go:64] FLAG: --address="0.0.0.0" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321598 4807 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321609 4807 flags.go:64] FLAG: --anonymous-auth="true" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321618 4807 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321626 4807 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321632 4807 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321641 4807 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321649 
4807 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321656 4807 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321663 4807 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321670 4807 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321677 4807 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321684 4807 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321690 4807 flags.go:64] FLAG: --cgroup-root="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321696 4807 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321702 4807 flags.go:64] FLAG: --client-ca-file="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321708 4807 flags.go:64] FLAG: --cloud-config="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321713 4807 flags.go:64] FLAG: --cloud-provider="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321720 4807 flags.go:64] FLAG: --cluster-dns="[]" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321728 4807 flags.go:64] FLAG: --cluster-domain="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321736 4807 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321745 4807 flags.go:64] FLAG: --config-dir="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321752 4807 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321761 4807 flags.go:64] FLAG: --container-log-max-files="5" Nov 27 11:09:19 crc 
kubenswrapper[4807]: I1127 11:09:19.321772 4807 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321779 4807 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321787 4807 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321795 4807 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321803 4807 flags.go:64] FLAG: --contention-profiling="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321826 4807 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321836 4807 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321843 4807 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321849 4807 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321859 4807 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321867 4807 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321874 4807 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321882 4807 flags.go:64] FLAG: --enable-load-reader="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321890 4807 flags.go:64] FLAG: --enable-server="true" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321898 4807 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321909 4807 flags.go:64] FLAG: --event-burst="100" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321917 4807 flags.go:64] FLAG: 
--event-qps="50" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321923 4807 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321929 4807 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321937 4807 flags.go:64] FLAG: --eviction-hard="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321947 4807 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321954 4807 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321963 4807 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321971 4807 flags.go:64] FLAG: --eviction-soft="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321979 4807 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321986 4807 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.321993 4807 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322000 4807 flags.go:64] FLAG: --experimental-mounter-path="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322007 4807 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322016 4807 flags.go:64] FLAG: --fail-swap-on="true" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322024 4807 flags.go:64] FLAG: --feature-gates="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322032 4807 flags.go:64] FLAG: --file-check-frequency="20s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322038 4807 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322045 4807 flags.go:64] 
FLAG: --hairpin-mode="promiscuous-bridge" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322052 4807 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322059 4807 flags.go:64] FLAG: --healthz-port="10248" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322065 4807 flags.go:64] FLAG: --help="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322071 4807 flags.go:64] FLAG: --hostname-override="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322077 4807 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322084 4807 flags.go:64] FLAG: --http-check-frequency="20s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322091 4807 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322097 4807 flags.go:64] FLAG: --image-credential-provider-config="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322103 4807 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322109 4807 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322115 4807 flags.go:64] FLAG: --image-service-endpoint="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322121 4807 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322128 4807 flags.go:64] FLAG: --kube-api-burst="100" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322134 4807 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322141 4807 flags.go:64] FLAG: --kube-api-qps="50" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322147 4807 flags.go:64] FLAG: --kube-reserved="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322153 4807 flags.go:64] 
FLAG: --kube-reserved-cgroup="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322159 4807 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322166 4807 flags.go:64] FLAG: --kubelet-cgroups="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322172 4807 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322178 4807 flags.go:64] FLAG: --lock-file="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322184 4807 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322191 4807 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322197 4807 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322207 4807 flags.go:64] FLAG: --log-json-split-stream="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322213 4807 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322219 4807 flags.go:64] FLAG: --log-text-split-stream="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322226 4807 flags.go:64] FLAG: --logging-format="text" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322232 4807 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322239 4807 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322270 4807 flags.go:64] FLAG: --manifest-url="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322276 4807 flags.go:64] FLAG: --manifest-url-header="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322285 4807 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322292 4807 
flags.go:64] FLAG: --max-open-files="1000000" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322300 4807 flags.go:64] FLAG: --max-pods="110" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322308 4807 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322315 4807 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322323 4807 flags.go:64] FLAG: --memory-manager-policy="None" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322330 4807 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322338 4807 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322346 4807 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322354 4807 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322371 4807 flags.go:64] FLAG: --node-status-max-images="50" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322379 4807 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322387 4807 flags.go:64] FLAG: --oom-score-adj="-999" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322395 4807 flags.go:64] FLAG: --pod-cidr="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322402 4807 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322415 4807 flags.go:64] FLAG: --pod-manifest-path="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322424 4807 flags.go:64] FLAG: --pod-max-pids="-1" Nov 27 
11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322431 4807 flags.go:64] FLAG: --pods-per-core="0" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322438 4807 flags.go:64] FLAG: --port="10250" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322445 4807 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322453 4807 flags.go:64] FLAG: --provider-id="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322460 4807 flags.go:64] FLAG: --qos-reserved="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322469 4807 flags.go:64] FLAG: --read-only-port="10255" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322477 4807 flags.go:64] FLAG: --register-node="true" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322484 4807 flags.go:64] FLAG: --register-schedulable="true" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322492 4807 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322506 4807 flags.go:64] FLAG: --registry-burst="10" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322513 4807 flags.go:64] FLAG: --registry-qps="5" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322521 4807 flags.go:64] FLAG: --reserved-cpus="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322527 4807 flags.go:64] FLAG: --reserved-memory="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322535 4807 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322541 4807 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322548 4807 flags.go:64] FLAG: --rotate-certificates="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322555 4807 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322563 
4807 flags.go:64] FLAG: --runonce="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322572 4807 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322580 4807 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322589 4807 flags.go:64] FLAG: --seccomp-default="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322597 4807 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322604 4807 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322612 4807 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322620 4807 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322626 4807 flags.go:64] FLAG: --storage-driver-password="root" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322632 4807 flags.go:64] FLAG: --storage-driver-secure="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322639 4807 flags.go:64] FLAG: --storage-driver-table="stats" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322645 4807 flags.go:64] FLAG: --storage-driver-user="root" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322653 4807 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322662 4807 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322670 4807 flags.go:64] FLAG: --system-cgroups="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322677 4807 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322690 4807 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 27 
11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322698 4807 flags.go:64] FLAG: --tls-cert-file="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322706 4807 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322716 4807 flags.go:64] FLAG: --tls-min-version="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322723 4807 flags.go:64] FLAG: --tls-private-key-file="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322731 4807 flags.go:64] FLAG: --topology-manager-policy="none" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322738 4807 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322745 4807 flags.go:64] FLAG: --topology-manager-scope="container" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322752 4807 flags.go:64] FLAG: --v="2" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322763 4807 flags.go:64] FLAG: --version="false" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322775 4807 flags.go:64] FLAG: --vmodule="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322784 4807 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.322793 4807 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.322990 4807 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323001 4807 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323008 4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323017 4807 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323025 4807 feature_gate.go:330] unrecognized feature gate: Example Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323033 4807 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323040 4807 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323047 4807 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323054 4807 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323061 4807 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323067 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323076 4807 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323085 4807 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323093 4807 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323100 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323108 4807 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323115 4807 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323121 4807 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323128 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323134 4807 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323141 4807 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323148 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323155 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323161 4807 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323168 4807 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323174 4807 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323182 4807 
feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323188 4807 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323195 4807 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323202 4807 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323209 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323215 4807 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323222 4807 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323228 4807 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323235 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323267 4807 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323276 4807 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323286 4807 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323293 4807 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323301 4807 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323308 4807 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323315 4807 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323323 4807 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323329 4807 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323336 4807 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323342 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323349 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323356 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323362 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323368 4807 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323375 4807 
feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323381 4807 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323389 4807 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323395 4807 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323402 4807 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323408 4807 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323416 4807 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323424 4807 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323430 4807 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323437 4807 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323448 4807 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323453 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323459 4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323465 4807 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323470 4807 
feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323475 4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323480 4807 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323485 4807 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323490 4807 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323496 4807 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.323500 4807 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.327763 4807 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.340751 4807 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.340802 4807 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.340945 4807 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.340959 4807 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 27 11:09:19 crc 
kubenswrapper[4807]: W1127 11:09:19.340968 4807 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.340976 4807 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.340984 4807 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.340993 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341001 4807 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341009 4807 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341017 4807 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341025 4807 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341034 4807 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341042 4807 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341050 4807 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341058 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341066 4807 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341074 4807 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341085 4807 
feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341096 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341105 4807 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341115 4807 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341123 4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341133 4807 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341141 4807 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341149 4807 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341157 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341167 4807 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341175 4807 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341184 4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341192 4807 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341200 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 
11:09:19.341207 4807 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341215 4807 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341223 4807 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341230 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341240 4807 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341275 4807 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341284 4807 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341293 4807 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341301 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341310 4807 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341318 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341329 4807 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341340 4807 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341351 4807 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341360 4807 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341369 4807 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341380 4807 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341389 4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341398 4807 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341407 4807 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341415 4807 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341423 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341431 4807 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341440 4807 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341448 4807 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341455 4807 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341464 4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341472 4807 feature_gate.go:330] unrecognized 
feature gate: InsightsOnDemandDataGather Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341480 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341488 4807 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341495 4807 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341503 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341511 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341519 4807 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341527 4807 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341535 4807 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341543 4807 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341551 4807 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341559 4807 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341567 4807 feature_gate.go:330] unrecognized feature gate: Example Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341575 4807 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.341588 4807 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true 
DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341817 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341831 4807 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341840 4807 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341849 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341858 4807 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341866 4807 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341876 4807 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341885 4807 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341894 4807 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341902 4807 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341910 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341919 4807 feature_gate.go:330] unrecognized feature 
gate: MixedCPUsAllocation Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341930 4807 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341940 4807 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341950 4807 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341959 4807 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341968 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341977 4807 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341985 4807 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.341994 4807 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342002 4807 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342010 4807 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342018 4807 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342027 4807 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342036 4807 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342044 4807 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 27 
11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342053 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342061 4807 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342069 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342077 4807 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342084 4807 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342092 4807 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342101 4807 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342109 4807 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342117 4807 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342125 4807 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342133 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342143 4807 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342154 4807 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342164 4807 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342172 4807 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342181 4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342189 4807 feature_gate.go:330] unrecognized feature gate: Example Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342197 4807 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342204 4807 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342212 4807 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342220 4807 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342228 4807 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342236 4807 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342266 4807 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342275 4807 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342283 4807 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342290 4807 feature_gate.go:330] unrecognized feature 
gate: ClusterAPIInstall Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342299 4807 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342306 4807 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342314 4807 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342322 4807 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342330 4807 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342338 4807 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342346 4807 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342354 4807 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342362 4807 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342372 4807 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342381 4807 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342390 4807 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342397 4807 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342405 4807 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342412 4807 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342423 4807 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342433 4807 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.342442 4807 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.342454 4807 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.342709 4807 server.go:940] "Client rotation is on, will bootstrap in background" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.356574 4807 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Nov 27 
11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.356729 4807 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.358661 4807 server.go:997] "Starting client certificate rotation" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.358705 4807 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.358914 4807 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-10 01:49:22.722788154 +0000 UTC Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.359004 4807 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1046h40m3.36378709s for next certificate rotation Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.392234 4807 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.394277 4807 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.409585 4807 log.go:25] "Validated CRI v1 runtime API" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.439283 4807 log.go:25] "Validated CRI v1 image API" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.440917 4807 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.444488 4807 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-27-11-05-12-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.444524 4807 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.459466 4807 manager.go:217] Machine: {Timestamp:2025-11-27 11:09:19.457728729 +0000 UTC m=+0.557226947 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e BootID:ab395288-9712-459d-800d-cc193ee1f597 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:c4:7a:03 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:c4:7a:03 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b3:9c:bd Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:c2:83:3b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:73:2e:39 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:8b:31:2d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:9e:44:ba:cc:dc:2e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a2:21:e7:46:a3:58 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.459926 4807 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.460114 4807 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.460602 4807 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.460823 4807 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.460914 4807 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.461171 4807 topology_manager.go:138] "Creating topology manager with none policy"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.461229 4807 container_manager_linux.go:303] "Creating device plugin manager"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.461809 4807 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.461888 4807 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.462160 4807 state_mem.go:36] "Initialized new in-memory state store"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.462309 4807 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.465573 4807 kubelet.go:418] "Attempting to sync node with API server"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.465677 4807 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.465770 4807 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.465847 4807 kubelet.go:324] "Adding apiserver pod source"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.465924 4807 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.469640 4807 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.470915 4807 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.473162 4807 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.473537 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.473533 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
Nov 27 11:09:19 crc kubenswrapper[4807]: E1127 11:09:19.473618 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError"
Nov 27 11:09:19 crc kubenswrapper[4807]: E1127 11:09:19.473638 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.474863 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.474913 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.474933 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.474945 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.474966 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.474979 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.474992 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.475020 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.475038 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.475050 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.475068 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.475080 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.476055 4807 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.476780 4807 server.go:1280] "Started kubelet"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.476913 4807 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.477557 4807 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.477552 4807 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.478057 4807 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Nov 27 11:09:19 crc systemd[1]: Started Kubernetes Kubelet.
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.479381 4807 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.479421 4807 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.479722 4807 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 03:21:18.455162681 +0000 UTC
Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.484417 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused
Nov 27 11:09:19 crc kubenswrapper[4807]: E1127 11:09:19.486209 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError"
Nov 27 11:09:19 crc kubenswrapper[4807]: E1127 11:09:19.483842 4807 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.483799 4807 volume_manager.go:287] "The desired_state_of_world populator starts"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.486270 4807 volume_manager.go:289] "Starting Kubelet Volume Manager"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.483817 4807 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Nov 27 11:09:19 crc kubenswrapper[4807]: E1127 11:09:19.486524 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="200ms"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.487934 4807 factory.go:55] Registering systemd factory
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.487966 4807 factory.go:221] Registration of the systemd container factory successfully
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.488414 4807 factory.go:153] Registering CRI-O factory
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.488457 4807 factory.go:221] Registration of the crio container factory successfully
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.488553 4807 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.488591 4807 factory.go:103] Registering Raw factory
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.488616 4807 manager.go:1196] Started watching for new ooms in manager
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.488633 4807 server.go:460] "Adding debug handlers to kubelet server"
Nov 27 11:09:19 crc kubenswrapper[4807]: E1127 11:09:19.487402 4807 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187bd8886b858168 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-27 11:09:19.476744552 +0000 UTC m=+0.576242760,LastTimestamp:2025-11-27 11:09:19.476744552 +0000 UTC m=+0.576242760,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.489586 4807 manager.go:319] Starting recovery of all containers
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498103 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498202 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498220 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498232 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498298 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498309 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498320 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498331 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498344 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498357 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498369 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498382 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498397 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498458 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498472 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498484 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498496 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498511 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498524 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498535 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498577 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498593 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498605 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498617 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498627 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498639 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498703 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498716 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498728 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498738 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498747 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498757 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498767 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498777 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498816 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498828 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498839 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498853 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498869 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498882 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498894 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498905 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498917 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498929 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498945 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498958 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498972 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.498988 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499008 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499025 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499040 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499055 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499079 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499092 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499103 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499115 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499126 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499137 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499149 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499161 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499172 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499183 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499197 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499208 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499219 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499229 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499258 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499270 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499281 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499289 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499298 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499311 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499330 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499341 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499351 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499360 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499371 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499380 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499392 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499404 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499416 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499425 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753"
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499435 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499445 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499457 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499470 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499482 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499493 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499504 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499514 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499527 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499538 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499551 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499561 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499573 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499584 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499597 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499608 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499619 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499630 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" 
seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499641 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499652 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499665 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499677 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499699 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499714 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499727 4807 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499740 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499789 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499811 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499823 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499835 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499850 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499863 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499877 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499889 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.499903 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.501991 4807 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502034 4807 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502052 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502066 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502079 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502089 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502100 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502110 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502125 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502145 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502158 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502171 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502182 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502194 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502208 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502218 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502230 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502242 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502271 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502284 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" 
Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502295 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502309 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502323 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502335 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502349 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502363 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502421 4807 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502437 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502451 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502467 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502484 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502503 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502532 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502548 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502561 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502574 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502589 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502604 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502621 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502635 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502649 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502663 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502677 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502694 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502709 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502723 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502736 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502749 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502764 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502778 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502793 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502810 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502823 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502837 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502850 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502864 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502877 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502888 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502900 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502915 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502950 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502961 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502972 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502984 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.502995 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503006 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503018 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503030 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503044 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503057 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503068 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503082 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503094 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503107 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503119 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" 
seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503132 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503143 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503154 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503165 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503177 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503189 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503200 4807 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503212 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503225 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503236 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503266 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503278 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503289 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503299 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503310 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503321 4807 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503334 4807 reconstruct.go:97] "Volume reconstruction finished" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.503342 4807 reconciler.go:26] "Reconciler: start to sync state" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.509890 4807 manager.go:324] Recovery completed Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.520198 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.522276 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.522313 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:19 crc 
kubenswrapper[4807]: I1127 11:09:19.522323 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.522885 4807 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.522896 4807 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.522913 4807 state_mem.go:36] "Initialized new in-memory state store" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.528583 4807 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.530936 4807 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.530997 4807 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.531054 4807 kubelet.go:2335] "Starting kubelet main sync loop" Nov 27 11:09:19 crc kubenswrapper[4807]: E1127 11:09:19.531120 4807 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 27 11:09:19 crc kubenswrapper[4807]: W1127 11:09:19.532582 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 11:09:19 crc kubenswrapper[4807]: E1127 11:09:19.532652 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection 
refused" logger="UnhandledError" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.542753 4807 policy_none.go:49] "None policy: Start" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.543574 4807 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.543600 4807 state_mem.go:35] "Initializing new in-memory state store" Nov 27 11:09:19 crc kubenswrapper[4807]: E1127 11:09:19.586901 4807 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.593481 4807 manager.go:334] "Starting Device Plugin manager" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.593561 4807 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.593578 4807 server.go:79] "Starting device plugin registration server" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.595289 4807 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.595324 4807 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.595884 4807 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.596020 4807 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.596032 4807 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 27 11:09:19 crc kubenswrapper[4807]: E1127 11:09:19.601809 4807 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 
11:09:19.631203 4807 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.631372 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.632724 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.632782 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.632793 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.632965 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.633403 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.633469 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.633797 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.633846 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.633860 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.634107 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.634238 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.634298 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.634675 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.634707 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.634718 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.634947 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.634965 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.634975 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.635174 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.635456 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.635519 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.635613 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:19 crc 
kubenswrapper[4807]: I1127 11:09:19.635779 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.635814 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.636507 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.636540 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.636554 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.636780 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.636811 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.636825 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.636967 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.637188 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.637238 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.637665 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.637685 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.637693 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.637888 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.637911 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.638279 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.638300 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.638308 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.638460 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.638488 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:19 crc 
kubenswrapper[4807]: I1127 11:09:19.638501 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:19 crc kubenswrapper[4807]: E1127 11:09:19.688036 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="400ms" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.697083 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.698570 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.698626 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.698636 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.698669 4807 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 11:09:19 crc kubenswrapper[4807]: E1127 11:09:19.699873 4807 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.705757 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.705799 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.705823 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.705842 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.705862 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.705879 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.705894 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.705912 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.705976 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.706012 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.706030 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.706081 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.706125 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.706155 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.706173 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807185 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807237 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807274 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807315 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807333 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807350 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807364 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807377 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807384 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807527 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807393 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807598 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807613 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807627 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807640 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807655 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807506 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807671 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807458 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807484 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807483 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807741 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807502 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 
11:09:19.807509 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807454 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807825 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807847 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807849 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807861 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.807879 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.901008 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.902084 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.902127 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.902136 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.902162 4807 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 11:09:19 crc kubenswrapper[4807]: E1127 11:09:19.902651 4807 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.957286 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.962769 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.981985 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:19 crc kubenswrapper[4807]: I1127 11:09:19.996313 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 11:09:20 crc kubenswrapper[4807]: I1127 11:09:20.002928 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 27 11:09:20 crc kubenswrapper[4807]: W1127 11:09:20.003403 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8d6bfe0dd49b9a16bfcd7ce62dcaf8db12c5f0a7061247e2967e2a1956e3ba42 WatchSource:0}: Error finding container 8d6bfe0dd49b9a16bfcd7ce62dcaf8db12c5f0a7061247e2967e2a1956e3ba42: Status 404 returned error can't find the container with id 8d6bfe0dd49b9a16bfcd7ce62dcaf8db12c5f0a7061247e2967e2a1956e3ba42 Nov 27 11:09:20 crc kubenswrapper[4807]: W1127 11:09:20.004397 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-3093a70b36d2fa96393ad73245f6c435b79ad2a4996ba0459c214cad8898684e WatchSource:0}: Error finding container 3093a70b36d2fa96393ad73245f6c435b79ad2a4996ba0459c214cad8898684e: Status 404 returned error can't find the container with id 3093a70b36d2fa96393ad73245f6c435b79ad2a4996ba0459c214cad8898684e Nov 27 11:09:20 crc kubenswrapper[4807]: W1127 11:09:20.010095 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-4643abfa6e583eb658978778c1de0cb8fce8b0db5a58ab4cf64acd1eb49660f8 WatchSource:0}: Error finding container 4643abfa6e583eb658978778c1de0cb8fce8b0db5a58ab4cf64acd1eb49660f8: Status 404 returned 
error can't find the container with id 4643abfa6e583eb658978778c1de0cb8fce8b0db5a58ab4cf64acd1eb49660f8 Nov 27 11:09:20 crc kubenswrapper[4807]: W1127 11:09:20.016431 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-13bbae557bf4d8cb53e8939ac5649b0eef28be98b0f8c86272249227f969dc3a WatchSource:0}: Error finding container 13bbae557bf4d8cb53e8939ac5649b0eef28be98b0f8c86272249227f969dc3a: Status 404 returned error can't find the container with id 13bbae557bf4d8cb53e8939ac5649b0eef28be98b0f8c86272249227f969dc3a Nov 27 11:09:20 crc kubenswrapper[4807]: W1127 11:09:20.017358 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-be99032fd4933838f5bd3e9ec46c13b2816a09fac67c1d4ae0517e35b75d0c61 WatchSource:0}: Error finding container be99032fd4933838f5bd3e9ec46c13b2816a09fac67c1d4ae0517e35b75d0c61: Status 404 returned error can't find the container with id be99032fd4933838f5bd3e9ec46c13b2816a09fac67c1d4ae0517e35b75d0c61 Nov 27 11:09:20 crc kubenswrapper[4807]: E1127 11:09:20.089734 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="800ms" Nov 27 11:09:20 crc kubenswrapper[4807]: I1127 11:09:20.303045 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:20 crc kubenswrapper[4807]: I1127 11:09:20.304666 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:20 crc kubenswrapper[4807]: I1127 11:09:20.304698 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 11:09:20 crc kubenswrapper[4807]: I1127 11:09:20.304706 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:20 crc kubenswrapper[4807]: I1127 11:09:20.304727 4807 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 11:09:20 crc kubenswrapper[4807]: E1127 11:09:20.305086 4807 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Nov 27 11:09:20 crc kubenswrapper[4807]: W1127 11:09:20.432407 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 11:09:20 crc kubenswrapper[4807]: E1127 11:09:20.432481 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 11:09:20 crc kubenswrapper[4807]: W1127 11:09:20.459290 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 11:09:20 crc kubenswrapper[4807]: E1127 11:09:20.459361 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 11:09:20 crc kubenswrapper[4807]: I1127 11:09:20.478430 4807 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 11:09:20 crc kubenswrapper[4807]: I1127 11:09:20.486674 4807 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 18:01:52.625647131 +0000 UTC Nov 27 11:09:20 crc kubenswrapper[4807]: I1127 11:09:20.536277 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3093a70b36d2fa96393ad73245f6c435b79ad2a4996ba0459c214cad8898684e"} Nov 27 11:09:20 crc kubenswrapper[4807]: I1127 11:09:20.537281 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"be99032fd4933838f5bd3e9ec46c13b2816a09fac67c1d4ae0517e35b75d0c61"} Nov 27 11:09:20 crc kubenswrapper[4807]: I1127 11:09:20.538297 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"13bbae557bf4d8cb53e8939ac5649b0eef28be98b0f8c86272249227f969dc3a"} Nov 27 11:09:20 crc kubenswrapper[4807]: I1127 11:09:20.539806 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4643abfa6e583eb658978778c1de0cb8fce8b0db5a58ab4cf64acd1eb49660f8"} Nov 27 11:09:20 crc kubenswrapper[4807]: I1127 11:09:20.541579 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8d6bfe0dd49b9a16bfcd7ce62dcaf8db12c5f0a7061247e2967e2a1956e3ba42"} Nov 27 11:09:20 crc kubenswrapper[4807]: W1127 11:09:20.549310 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 11:09:20 crc kubenswrapper[4807]: E1127 11:09:20.549382 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 11:09:20 crc kubenswrapper[4807]: W1127 11:09:20.863543 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 11:09:20 crc kubenswrapper[4807]: E1127 11:09:20.863697 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 11:09:20 crc kubenswrapper[4807]: E1127 11:09:20.891343 4807 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="1.6s" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.106099 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.107392 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.107428 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.107437 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.107464 4807 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 11:09:21 crc kubenswrapper[4807]: E1127 11:09:21.108042 4807 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" node="crc" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.479013 4807 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.487277 4807 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 16:42:01.327439412 +0000 UTC Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.487356 4807 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 
821h32m39.840086968s for next certificate rotation Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.548397 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156" exitCode=0 Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.548564 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.548588 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156"} Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.549687 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.549711 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.549722 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.550165 4807 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc" exitCode=0 Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.550264 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc"} Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.550295 4807 kubelet_node_status.go:401] "Setting node annotation 
to enable volume controller attach/detach" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.551062 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.551084 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.551096 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.551187 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.552149 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.552181 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.552190 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.552671 4807 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207" exitCode=0 Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.552714 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.552739 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207"} Nov 27 11:09:21 crc 
kubenswrapper[4807]: I1127 11:09:21.553461 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.553489 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.553498 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.557741 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d"} Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.557799 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d"} Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.557816 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba"} Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.557830 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb"} Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.557764 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.558772 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.558803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.558811 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.558964 4807 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc" exitCode=0 Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.558998 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc"} Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.559072 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.559908 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.559940 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:21 crc kubenswrapper[4807]: I1127 11:09:21.559952 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.478743 4807 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.204:6443: connect: connection refused Nov 27 11:09:22 crc kubenswrapper[4807]: E1127 11:09:22.492651 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="3.2s" Nov 27 11:09:22 crc kubenswrapper[4807]: W1127 11:09:22.533604 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 11:09:22 crc kubenswrapper[4807]: E1127 11:09:22.533683 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.564740 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b"} Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.564779 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71"} Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.564789 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b"} Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.564798 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd"} Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.564807 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b"} Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.564819 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.565647 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.565674 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.565684 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.566434 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5855d42391ccbd3f32e7abf944c071e0912ec43fa3137269b842b95e6907b209"} Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.566459 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:22 crc kubenswrapper[4807]: 
I1127 11:09:22.567014 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.567037 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.567045 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.568865 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"284dd629e6c81f1232a2d56007a2a7471423b7a601c38bcd3bc264ab9586fc53"} Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.568909 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"88a8b1a2034b437496a624127bf754d60abab11457ad19e6e074fe454d0e21b5"} Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.568925 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8d33545207add221fa61a7c8259b245fa2f114f53ef101d74503d4bbadad20fd"} Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.568914 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.569691 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.569710 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:22 crc 
kubenswrapper[4807]: I1127 11:09:22.569720 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.570393 4807 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e" exitCode=0 Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.570423 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e"} Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.570466 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.570502 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.571189 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.571210 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.571218 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.571217 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.571281 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.571291 4807 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:22 crc kubenswrapper[4807]: E1127 11:09:22.630193 4807 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187bd8886b858168 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-27 11:09:19.476744552 +0000 UTC m=+0.576242760,LastTimestamp:2025-11-27 11:09:19.476744552 +0000 UTC m=+0.576242760,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.709208 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.710283 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.710326 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.710337 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:22 crc kubenswrapper[4807]: I1127 11:09:22.710364 4807 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 11:09:22 crc kubenswrapper[4807]: E1127 11:09:22.710766 4807 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.204:6443: connect: connection refused" 
node="crc" Nov 27 11:09:22 crc kubenswrapper[4807]: W1127 11:09:22.792810 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.204:6443: connect: connection refused Nov 27 11:09:22 crc kubenswrapper[4807]: E1127 11:09:22.792879 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.204:6443: connect: connection refused" logger="UnhandledError" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.143669 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.576036 4807 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0" exitCode=0 Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.576275 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.576307 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.576363 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0"} Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.576436 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.576307 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.576479 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.576504 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.576567 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578033 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578078 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578094 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578157 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578192 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578209 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578514 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578559 4807 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578577 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578586 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578609 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578620 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578584 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578651 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578662 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:23 crc kubenswrapper[4807]: I1127 11:09:23.578648 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 11:09:24.039264 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 11:09:24.584491 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a"} Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 
11:09:24.584582 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2"} Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 11:09:24.584616 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1"} Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 11:09:24.584646 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d"} Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 11:09:24.584649 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 11:09:24.584593 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 11:09:24.584749 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 11:09:24.586668 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 11:09:24.586704 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 11:09:24.586717 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 11:09:24.586788 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 11:09:24.586831 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:24 crc kubenswrapper[4807]: I1127 11:09:24.586850 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.315228 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.590979 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.591028 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.591030 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5"} Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.591998 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.592023 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.592035 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.592129 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.592189 4807 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.592208 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.910894 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.912162 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.912193 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.912201 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:25 crc kubenswrapper[4807]: I1127 11:09:25.912223 4807 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 11:09:26 crc kubenswrapper[4807]: I1127 11:09:26.144668 4807 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 27 11:09:26 crc kubenswrapper[4807]: I1127 11:09:26.144752 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 11:09:26 crc kubenswrapper[4807]: I1127 11:09:26.593155 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Nov 27 11:09:26 crc kubenswrapper[4807]: I1127 11:09:26.593372 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:26 crc kubenswrapper[4807]: I1127 11:09:26.595816 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:26 crc kubenswrapper[4807]: I1127 11:09:26.595859 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:26 crc kubenswrapper[4807]: I1127 11:09:26.595869 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:26 crc kubenswrapper[4807]: I1127 11:09:26.596395 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:26 crc kubenswrapper[4807]: I1127 11:09:26.596459 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:26 crc kubenswrapper[4807]: I1127 11:09:26.596472 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:28 crc kubenswrapper[4807]: I1127 11:09:28.046542 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:28 crc kubenswrapper[4807]: I1127 11:09:28.046727 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:28 crc kubenswrapper[4807]: I1127 11:09:28.047765 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:28 crc kubenswrapper[4807]: I1127 11:09:28.047810 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:28 crc kubenswrapper[4807]: I1127 11:09:28.047822 4807 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:29 crc kubenswrapper[4807]: I1127 11:09:29.477843 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:29 crc kubenswrapper[4807]: I1127 11:09:29.477990 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:29 crc kubenswrapper[4807]: I1127 11:09:29.479236 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:29 crc kubenswrapper[4807]: I1127 11:09:29.479308 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:29 crc kubenswrapper[4807]: I1127 11:09:29.479320 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:29 crc kubenswrapper[4807]: E1127 11:09:29.601923 4807 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 27 11:09:29 crc kubenswrapper[4807]: I1127 11:09:29.711562 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:29 crc kubenswrapper[4807]: I1127 11:09:29.717876 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:29 crc kubenswrapper[4807]: I1127 11:09:29.895887 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Nov 27 11:09:29 crc kubenswrapper[4807]: I1127 11:09:29.896122 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:29 crc kubenswrapper[4807]: I1127 11:09:29.897512 4807 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:29 crc kubenswrapper[4807]: I1127 11:09:29.897717 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:29 crc kubenswrapper[4807]: I1127 11:09:29.897901 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:30 crc kubenswrapper[4807]: I1127 11:09:30.051395 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:30 crc kubenswrapper[4807]: I1127 11:09:30.052227 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:30 crc kubenswrapper[4807]: I1127 11:09:30.052328 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:30 crc kubenswrapper[4807]: I1127 11:09:30.052346 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:30 crc kubenswrapper[4807]: I1127 11:09:30.056534 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:30 crc kubenswrapper[4807]: I1127 11:09:30.195695 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Nov 27 11:09:30 crc kubenswrapper[4807]: I1127 11:09:30.195886 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:30 crc kubenswrapper[4807]: I1127 11:09:30.196936 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:30 crc kubenswrapper[4807]: I1127 11:09:30.196968 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 27 11:09:30 crc kubenswrapper[4807]: I1127 11:09:30.196979 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:31 crc kubenswrapper[4807]: I1127 11:09:31.054011 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:31 crc kubenswrapper[4807]: I1127 11:09:31.055207 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:31 crc kubenswrapper[4807]: I1127 11:09:31.055374 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:31 crc kubenswrapper[4807]: I1127 11:09:31.055402 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:32 crc kubenswrapper[4807]: I1127 11:09:32.056032 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:32 crc kubenswrapper[4807]: I1127 11:09:32.057593 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:32 crc kubenswrapper[4807]: I1127 11:09:32.057654 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:32 crc kubenswrapper[4807]: I1127 11:09:32.057675 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:33 crc kubenswrapper[4807]: W1127 11:09:33.260451 4807 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Nov 27 11:09:33 crc kubenswrapper[4807]: I1127 11:09:33.260533 4807 trace.go:236] Trace[271782830]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (27-Nov-2025 11:09:23.258) (total time: 10001ms): Nov 27 11:09:33 crc kubenswrapper[4807]: Trace[271782830]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:09:33.260) Nov 27 11:09:33 crc kubenswrapper[4807]: Trace[271782830]: [10.001524241s] [10.001524241s] END Nov 27 11:09:33 crc kubenswrapper[4807]: E1127 11:09:33.260553 4807 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Nov 27 11:09:33 crc kubenswrapper[4807]: I1127 11:09:33.347322 4807 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 27 11:09:33 crc kubenswrapper[4807]: I1127 11:09:33.347395 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 27 11:09:33 crc kubenswrapper[4807]: I1127 11:09:33.351740 4807 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User 
\"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Nov 27 11:09:33 crc kubenswrapper[4807]: I1127 11:09:33.351793 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Nov 27 11:09:34 crc kubenswrapper[4807]: I1127 11:09:34.044986 4807 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]log ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]etcd ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/openshift.io-api-request-count-filter ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/openshift.io-startkubeinformers ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/start-apiserver-admission-initializer ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/generic-apiserver-start-informers ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/priority-and-fairness-config-consumer ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/priority-and-fairness-filter ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/storage-object-count-tracker-hook ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/start-apiextensions-informers ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/start-apiextensions-controllers ok Nov 27 11:09:34 crc kubenswrapper[4807]: 
[+]poststarthook/crd-informer-synced ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/start-system-namespaces-controller ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/start-cluster-authentication-info-controller ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/start-legacy-token-tracking-controller ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/start-service-ip-repair-controllers ok Nov 27 11:09:34 crc kubenswrapper[4807]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Nov 27 11:09:34 crc kubenswrapper[4807]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/priority-and-fairness-config-producer ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/bootstrap-controller ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/start-kube-aggregator-informers ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/apiservice-status-local-available-controller ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/apiservice-status-remote-available-controller ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/apiservice-registration-controller ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/apiservice-wait-for-first-sync ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/apiservice-discovery-controller ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/kube-apiserver-autoregistration ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]autoregister-completion ok Nov 27 11:09:34 crc kubenswrapper[4807]: 
[+]poststarthook/apiservice-openapi-controller ok Nov 27 11:09:34 crc kubenswrapper[4807]: [+]poststarthook/apiservice-openapiv3-controller ok Nov 27 11:09:34 crc kubenswrapper[4807]: livez check failed Nov 27 11:09:34 crc kubenswrapper[4807]: I1127 11:09:34.045065 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:09:34 crc kubenswrapper[4807]: I1127 11:09:34.062495 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 27 11:09:34 crc kubenswrapper[4807]: I1127 11:09:34.064710 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b" exitCode=255 Nov 27 11:09:34 crc kubenswrapper[4807]: I1127 11:09:34.064778 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b"} Nov 27 11:09:34 crc kubenswrapper[4807]: I1127 11:09:34.065028 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:34 crc kubenswrapper[4807]: I1127 11:09:34.066318 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:34 crc kubenswrapper[4807]: I1127 11:09:34.066373 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:34 crc kubenswrapper[4807]: I1127 11:09:34.066421 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Nov 27 11:09:34 crc kubenswrapper[4807]: I1127 11:09:34.067238 4807 scope.go:117] "RemoveContainer" containerID="08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b" Nov 27 11:09:35 crc kubenswrapper[4807]: I1127 11:09:35.071264 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 27 11:09:35 crc kubenswrapper[4807]: I1127 11:09:35.072808 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977"} Nov 27 11:09:35 crc kubenswrapper[4807]: I1127 11:09:35.072941 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:35 crc kubenswrapper[4807]: I1127 11:09:35.073705 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:35 crc kubenswrapper[4807]: I1127 11:09:35.073740 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:35 crc kubenswrapper[4807]: I1127 11:09:35.073752 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:35 crc kubenswrapper[4807]: I1127 11:09:35.316190 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:36 crc kubenswrapper[4807]: I1127 11:09:36.074871 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:36 crc kubenswrapper[4807]: I1127 11:09:36.075607 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:36 crc 
kubenswrapper[4807]: I1127 11:09:36.075639 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:36 crc kubenswrapper[4807]: I1127 11:09:36.075651 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:36 crc kubenswrapper[4807]: I1127 11:09:36.144512 4807 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 27 11:09:36 crc kubenswrapper[4807]: I1127 11:09:36.144580 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 27 11:09:38 crc kubenswrapper[4807]: E1127 11:09:38.338046 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Nov 27 11:09:38 crc kubenswrapper[4807]: I1127 11:09:38.342707 4807 trace.go:236] Trace[261301699]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Nov-2025 11:09:26.036) (total time: 12306ms): Nov 27 11:09:38 crc kubenswrapper[4807]: Trace[261301699]: ---"Objects listed" error: 12306ms (11:09:38.342) Nov 27 11:09:38 crc kubenswrapper[4807]: Trace[261301699]: [12.306226289s] [12.306226289s] END Nov 27 11:09:38 crc kubenswrapper[4807]: I1127 11:09:38.342741 4807 
reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 27 11:09:38 crc kubenswrapper[4807]: I1127 11:09:38.342848 4807 trace.go:236] Trace[1725226523]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Nov-2025 11:09:23.516) (total time: 14826ms): Nov 27 11:09:38 crc kubenswrapper[4807]: Trace[1725226523]: ---"Objects listed" error: 14826ms (11:09:38.342) Nov 27 11:09:38 crc kubenswrapper[4807]: Trace[1725226523]: [14.826082301s] [14.826082301s] END Nov 27 11:09:38 crc kubenswrapper[4807]: I1127 11:09:38.342860 4807 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 27 11:09:38 crc kubenswrapper[4807]: I1127 11:09:38.343441 4807 trace.go:236] Trace[1125452988]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Nov-2025 11:09:28.046) (total time: 10296ms): Nov 27 11:09:38 crc kubenswrapper[4807]: Trace[1125452988]: ---"Objects listed" error: 10296ms (11:09:38.343) Nov 27 11:09:38 crc kubenswrapper[4807]: Trace[1125452988]: [10.296798717s] [10.296798717s] END Nov 27 11:09:38 crc kubenswrapper[4807]: I1127 11:09:38.343461 4807 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 27 11:09:38 crc kubenswrapper[4807]: E1127 11:09:38.344659 4807 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Nov 27 11:09:38 crc kubenswrapper[4807]: I1127 11:09:38.344693 4807 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Nov 27 11:09:38 crc kubenswrapper[4807]: I1127 11:09:38.461769 4807 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.044755 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.049491 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.052379 4807 apiserver.go:52] "Watching apiserver" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.055571 4807 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.055954 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-d4dvd","openshift-machine-config-operator/machine-config-daemon-kk425","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-lwph9","openshift-multus/multus-additional-cni-plugins-k6wll","openshift-multus/multus-xmngf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h"] Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.056286 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.056321 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.056375 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.056493 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.056579 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.056781 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.057192 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.057465 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d4dvd" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.057569 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.057732 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.057780 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.057782 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.057967 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.058446 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.061624 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.061902 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.062417 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.063004 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.063051 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.063750 4807 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.063790 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.065036 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.066150 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.066662 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.066735 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.066812 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.067408 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.068287 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.069517 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.069660 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.069798 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.069861 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.069936 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.070009 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.070046 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.070111 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.070123 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.070151 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.070132 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.070155 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.070219 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.070290 4807 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.070565 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.070663 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.077175 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.088061 4807 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.094111 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.106310 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.116046 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.124477 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.133206 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.141557 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.149010 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.149051 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.149068 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.149084 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.149100 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.149115 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.149156 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.149173 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.149337 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.149423 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.149499 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.149513 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150107 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150148 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150155 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150205 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150224 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150397 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod 
"44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150426 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150458 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150493 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150918 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150505 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: 
"6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150958 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150675 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150798 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.150865 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.151300 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.151363 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.151385 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.151483 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:09:39.651465622 +0000 UTC m=+20.750963820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.151848 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.151886 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.151923 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.151946 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 27 
11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.151996 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.152331 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.152506 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.152893 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.152960 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.152980 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.153419 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.153201 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.153360 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.153488 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.153513 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.154257 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.154278 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.154312 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.154331 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.154489 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.154676 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.154346 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.154700 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.154731 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.154752 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.154764 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155100 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155172 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155198 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155208 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155229 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155528 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155578 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155597 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155613 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155864 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155803 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155815 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155832 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155938 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155955 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.155969 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156144 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156196 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156237 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156285 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156303 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156203 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156317 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156790 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156752 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156756 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156808 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156815 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156874 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.156890 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157129 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157230 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157294 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157265 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157240 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157285 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157356 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157375 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157411 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157427 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157441 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157459 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157475 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157489 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157503 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157517 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157533 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157550 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157570 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157588 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157605 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157622 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157641 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157656 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157671 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157688 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157703 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157720 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157740 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157755 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157770 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157788 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157803 4807 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157818 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157834 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157849 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157865 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157880 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157926 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157945 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157962 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157971 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157982 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.157997 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158013 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158029 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158043 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158057 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158072 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158090 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158105 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158119 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158130 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158135 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158161 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158179 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158195 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158211 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158227 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158298 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158316 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158331 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158347 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158365 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158382 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158391 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158398 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158424 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158442 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 27 
11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158457 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158472 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158487 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158558 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158620 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158640 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158646 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158657 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158657 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158673 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158689 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158704 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158723 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158745 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 27 11:09:39 crc kubenswrapper[4807]: 
I1127 11:09:39.158755 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158767 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158789 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158835 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158854 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158870 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" 
(UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158890 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158913 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158935 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158956 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158971 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " 
Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.158986 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159003 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159020 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159036 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159052 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159077 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: 
"serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159408 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159057 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159457 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159090 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159487 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159518 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159560 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159571 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159591 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159595 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159619 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159697 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159706 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159769 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159808 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159841 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159879 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 11:09:39 crc 
kubenswrapper[4807]: I1127 11:09:39.159913 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159942 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159975 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160007 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159839 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159948 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160041 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.159977 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160005 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160072 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160093 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160101 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160129 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160165 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160198 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160229 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160276 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160307 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160318 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160344 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160369 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160406 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160436 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160440 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160513 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160709 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160739 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160776 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160806 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160842 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160866 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160890 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160924 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160947 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 27 11:09:39 crc 
kubenswrapper[4807]: I1127 11:09:39.160972 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161001 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161036 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161061 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161093 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161127 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161156 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161196 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161229 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161273 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161310 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 27 11:09:39 
crc kubenswrapper[4807]: I1127 11:09:39.161348 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161381 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161410 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161448 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161483 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161511 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161560 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161596 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161628 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161649 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161672 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 27 11:09:39 crc kubenswrapper[4807]: 
I1127 11:09:39.161696 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161717 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161741 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161769 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161793 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161927 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/aaae6992-39ea-4c99-b5e5-b4c025ec48f7-proxy-tls\") pod \"machine-config-daemon-kk425\" (UID: \"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\") " pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161956 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/579992dc-49bf-49ea-ad07-62beba6397df-cnibin\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161978 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-cni-bin\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161999 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovnkube-script-lib\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162017 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/aaae6992-39ea-4c99-b5e5-b4c025ec48f7-rootfs\") pod \"machine-config-daemon-kk425\" (UID: \"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\") " pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162037 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/579992dc-49bf-49ea-ad07-62beba6397df-system-cni-dir\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162064 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162100 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-var-lib-kubelet\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162126 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/579992dc-49bf-49ea-ad07-62beba6397df-os-release\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162150 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovn-node-metrics-cert\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc 
kubenswrapper[4807]: I1127 11:09:39.162175 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-var-lib-cni-multus\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162192 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-run-netns\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162212 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-systemd\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162231 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-var-lib-openvswitch\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162284 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-node-log\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 
11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162313 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162337 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162359 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97f15cbb-220e-47db-b418-3a5aa4eb55a2-cni-binary-copy\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162392 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-etc-kubernetes\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162411 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-log-socket\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc 
kubenswrapper[4807]: I1127 11:09:39.162433 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-run-k8s-cni-cncf-io\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162459 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162481 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162506 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksfsj\" (UniqueName: \"kubernetes.io/projected/e030331a-9097-479c-8226-8553c1423ae4-kube-api-access-ksfsj\") pod \"node-resolver-d4dvd\" (UID: \"e030331a-9097-479c-8226-8553c1423ae4\") " pod="openshift-dns/node-resolver-d4dvd" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162528 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/579992dc-49bf-49ea-ad07-62beba6397df-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k6wll\" 
(UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162557 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-systemd-units\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162581 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162603 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e030331a-9097-479c-8226-8553c1423ae4-hosts-file\") pod \"node-resolver-d4dvd\" (UID: \"e030331a-9097-479c-8226-8553c1423ae4\") " pod="openshift-dns/node-resolver-d4dvd" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162622 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-ovn\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162642 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-os-release\") pod \"multus-xmngf\" 
(UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162667 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bxfj\" (UniqueName: \"kubernetes.io/projected/97f15cbb-220e-47db-b418-3a5aa4eb55a2-kube-api-access-5bxfj\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162694 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162718 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dncsr\" (UniqueName: \"kubernetes.io/projected/aaae6992-39ea-4c99-b5e5-b4c025ec48f7-kube-api-access-dncsr\") pod \"machine-config-daemon-kk425\" (UID: \"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\") " pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162737 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-env-overrides\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162759 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-run-multus-certs\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162780 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-system-cni-dir\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162803 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162821 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-kubelet\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162850 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-multus-cni-dir\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162909 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162932 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162955 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w44dv\" (UniqueName: \"kubernetes.io/projected/579992dc-49bf-49ea-ad07-62beba6397df-kube-api-access-w44dv\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162973 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-openvswitch\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162994 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/97f15cbb-220e-47db-b418-3a5aa4eb55a2-multus-daemon-config\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163013 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163034 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-run-ovn-kubernetes\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163053 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nmsn\" (UniqueName: \"kubernetes.io/projected/9c85b740-1df9-4ae7-a51b-fdfd89668d64-kube-api-access-7nmsn\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163074 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-multus-conf-dir\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163094 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 
11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163116 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-var-lib-cni-bin\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163136 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aaae6992-39ea-4c99-b5e5-b4c025ec48f7-mcd-auth-proxy-config\") pod \"machine-config-daemon-kk425\" (UID: \"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\") " pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163298 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-etc-openvswitch\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163321 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-cni-netd\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163340 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-run-netns\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " 
pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163371 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-hostroot\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163399 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-slash\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163423 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-cnibin\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163443 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-multus-socket-dir-parent\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163464 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/579992dc-49bf-49ea-ad07-62beba6397df-cni-binary-copy\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " 
pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163483 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/579992dc-49bf-49ea-ad07-62beba6397df-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163506 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163527 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovnkube-config\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163552 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163575 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163689 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163703 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163719 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163730 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163741 4807 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163755 4807 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163768 4807 reconciler_common.go:293] "Volume 
detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163778 4807 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163789 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163802 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163813 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163826 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163837 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163850 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163861 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163872 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163885 4807 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163896 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163906 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163917 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163930 4807 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on 
node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163940 4807 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163952 4807 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163966 4807 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163980 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163990 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164001 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164015 4807 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc 
kubenswrapper[4807]: I1127 11:09:39.164026 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164036 4807 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164047 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164060 4807 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164071 4807 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164081 4807 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164090 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164104 4807 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164114 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164126 4807 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164139 4807 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164148 4807 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164158 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164172 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164188 4807 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164200 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164217 4807 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164232 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164260 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164271 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164281 4807 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164292 4807 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164306 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164317 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164326 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164339 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164350 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164366 4807 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164394 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164411 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164422 4807 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164433 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164442 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164456 4807 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164466 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164478 4807 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 27 
11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164492 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164502 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.169683 4807 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160626 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160704 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.160998 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161229 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161357 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161381 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161411 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161539 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161736 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161799 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161866 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.171151 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.171196 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.171346 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.171414 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.171487 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.171700 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.171803 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.171839 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.171908 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.171929 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.171946 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.172028 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.172278 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.172349 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.172493 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.172493 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.172509 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162069 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162085 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162259 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162319 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162371 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162427 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162520 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.172601 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162680 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162695 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162718 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162731 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.162769 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163080 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163261 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163294 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163341 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163613 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163801 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.163846 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164021 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164095 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164608 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164655 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164694 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164873 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.164872 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.165057 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.165137 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.166385 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.166432 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.166595 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.167338 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.167368 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.168043 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.168285 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.168572 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.168595 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.168784 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.168891 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.169021 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.169202 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.169344 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.169598 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.169619 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.169889 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.170081 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.170275 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.170342 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.170416 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.170478 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.170508 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.170527 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.170820 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.170831 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.172638 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.172968 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.173090 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.173121 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.173144 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.173234 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:39.673212315 +0000 UTC m=+20.772710513 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.173334 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.173427 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.173461 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.173521 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:39.673503433 +0000 UTC m=+20.773001701 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.173685 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.173911 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.174020 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.174003 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.174104 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.174304 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.174329 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.174624 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.174825 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.174852 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.175033 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.175133 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.175488 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.175728 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.175924 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.176038 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.176115 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.176128 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.176612 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.176435 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.176704 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.177500 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.161964 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.181797 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.181831 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.181846 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.181918 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:39.681897352 +0000 UTC m=+20.781395630 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.182593 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.183478 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.183502 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.183514 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.183559 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:39.683547427 +0000 UTC m=+20.783045745 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.184644 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.186700 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.190495 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.190823 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.190825 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.190949 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.191801 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.191849 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.192338 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.192648 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.195401 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.197461 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.202897 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.203396 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.205856 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.206682 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.209126 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.209890 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.210347 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.214314 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.214493 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.214885 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.220457 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.237461 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.243660 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.244954 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.247950 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.256873 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265577 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aaae6992-39ea-4c99-b5e5-b4c025ec48f7-mcd-auth-proxy-config\") pod \"machine-config-daemon-kk425\" (UID: \"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\") " pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265630 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-etc-openvswitch\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265655 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-cni-netd\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265678 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265703 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-var-lib-cni-bin\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265708 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-etc-openvswitch\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265725 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-slash\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265745 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-cnibin\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265743 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-cni-netd\") pod \"ovnkube-node-lwph9\" (UID: 
\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265766 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-multus-socket-dir-parent\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265786 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-run-netns\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265784 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-var-lib-cni-bin\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265822 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-cnibin\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265841 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-hostroot\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265757 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265860 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovnkube-config\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265873 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-run-netns\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265882 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/579992dc-49bf-49ea-ad07-62beba6397df-cni-binary-copy\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265893 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-hostroot\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265886 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-slash\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265906 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/579992dc-49bf-49ea-ad07-62beba6397df-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265980 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/579992dc-49bf-49ea-ad07-62beba6397df-cnibin\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266286 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-cni-bin\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266313 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovnkube-script-lib\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266332 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/aaae6992-39ea-4c99-b5e5-b4c025ec48f7-rootfs\") pod \"machine-config-daemon-kk425\" (UID: \"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\") " pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266352 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aaae6992-39ea-4c99-b5e5-b4c025ec48f7-proxy-tls\") pod \"machine-config-daemon-kk425\" (UID: \"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\") " pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266382 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/579992dc-49bf-49ea-ad07-62beba6397df-system-cni-dir\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266391 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-cni-bin\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265997 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-multus-socket-dir-parent\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266446 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/579992dc-49bf-49ea-ad07-62beba6397df-system-cni-dir\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266398 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/aaae6992-39ea-4c99-b5e5-b4c025ec48f7-rootfs\") pod \"machine-config-daemon-kk425\" (UID: \"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\") " pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266431 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-var-lib-kubelet\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266411 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-var-lib-kubelet\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266533 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/579992dc-49bf-49ea-ad07-62beba6397df-cni-binary-copy\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.265996 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/579992dc-49bf-49ea-ad07-62beba6397df-cnibin\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266561 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/579992dc-49bf-49ea-ad07-62beba6397df-os-release\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266585 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/579992dc-49bf-49ea-ad07-62beba6397df-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266611 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/579992dc-49bf-49ea-ad07-62beba6397df-os-release\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266593 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovn-node-metrics-cert\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266655 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-run-netns\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266670 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-systemd\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266686 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-var-lib-openvswitch\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266702 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-node-log\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266712 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-systemd\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266719 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-var-lib-cni-multus\") 
pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266780 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-run-netns\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266818 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-node-log\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266821 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-var-lib-cni-multus\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266839 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-var-lib-openvswitch\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266855 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovnkube-script-lib\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 
11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266859 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97f15cbb-220e-47db-b418-3a5aa4eb55a2-cni-binary-copy\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266891 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-etc-kubernetes\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266906 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-log-socket\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266922 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-etc-kubernetes\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266922 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266951 4807 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-ksfsj\" (UniqueName: \"kubernetes.io/projected/e030331a-9097-479c-8226-8553c1423ae4-kube-api-access-ksfsj\") pod \"node-resolver-d4dvd\" (UID: \"e030331a-9097-479c-8226-8553c1423ae4\") " pod="openshift-dns/node-resolver-d4dvd" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266967 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/579992dc-49bf-49ea-ad07-62beba6397df-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.266985 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-systemd-units\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267001 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-log-socket\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267002 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-run-k8s-cni-cncf-io\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267022 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-run-k8s-cni-cncf-io\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267063 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-ovn\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267007 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267073 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-systemd-units\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267081 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-os-release\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267136 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-os-release\") pod \"multus-xmngf\" (UID: 
\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267179 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e030331a-9097-479c-8226-8553c1423ae4-hosts-file\") pod \"node-resolver-d4dvd\" (UID: \"e030331a-9097-479c-8226-8553c1423ae4\") " pod="openshift-dns/node-resolver-d4dvd" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267111 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-ovn\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267213 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bxfj\" (UniqueName: \"kubernetes.io/projected/97f15cbb-220e-47db-b418-3a5aa4eb55a2-kube-api-access-5bxfj\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267280 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dncsr\" (UniqueName: \"kubernetes.io/projected/aaae6992-39ea-4c99-b5e5-b4c025ec48f7-kube-api-access-dncsr\") pod \"machine-config-daemon-kk425\" (UID: \"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\") " pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267308 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-env-overrides\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 
27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267335 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-run-multus-certs\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267356 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-system-cni-dir\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267386 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-kubelet\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267411 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w44dv\" (UniqueName: \"kubernetes.io/projected/579992dc-49bf-49ea-ad07-62beba6397df-kube-api-access-w44dv\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267433 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-openvswitch\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267453 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-multus-cni-dir\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267472 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267473 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97f15cbb-220e-47db-b418-3a5aa4eb55a2-cni-binary-copy\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267483 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-system-cni-dir\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267490 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-run-ovn-kubernetes\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267480 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovnkube-config\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267507 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-openvswitch\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267516 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nmsn\" (UniqueName: \"kubernetes.io/projected/9c85b740-1df9-4ae7-a51b-fdfd89668d64-kube-api-access-7nmsn\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267539 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-multus-conf-dir\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267559 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-run-ovn-kubernetes\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267563 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/97f15cbb-220e-47db-b418-3a5aa4eb55a2-multus-daemon-config\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267622 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-host-run-multus-certs\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267283 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e030331a-9097-479c-8226-8553c1423ae4-hosts-file\") pod \"node-resolver-d4dvd\" (UID: \"e030331a-9097-479c-8226-8553c1423ae4\") " pod="openshift-dns/node-resolver-d4dvd" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267539 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-kubelet\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267692 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-multus-cni-dir\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267698 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267713 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97f15cbb-220e-47db-b418-3a5aa4eb55a2-multus-conf-dir\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267739 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/579992dc-49bf-49ea-ad07-62beba6397df-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267809 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/aaae6992-39ea-4c99-b5e5-b4c025ec48f7-mcd-auth-proxy-config\") pod \"machine-config-daemon-kk425\" (UID: \"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\") " pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267860 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267868 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-env-overrides\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267875 4807 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267932 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267946 4807 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267956 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267965 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267976 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267988 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.267997 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268006 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268014 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268023 4807 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268032 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268040 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268049 4807 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268059 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" 
DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268068 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268076 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268085 4807 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268093 4807 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268102 4807 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268111 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268122 4807 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268131 4807 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268140 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268148 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268156 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268166 4807 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268175 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268184 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268192 4807 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268223 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268232 4807 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268255 4807 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268264 4807 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268273 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268281 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268289 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" 
DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268297 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268306 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268314 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268322 4807 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268330 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268339 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268348 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268359 4807 reconciler_common.go:293] 
"Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268367 4807 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268375 4807 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268383 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268392 4807 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268400 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268409 4807 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268417 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268425 4807 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268434 4807 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268474 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268483 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268492 4807 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268500 4807 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268511 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node 
\"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268521 4807 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268529 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268545 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268550 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/97f15cbb-220e-47db-b418-3a5aa4eb55a2-multus-daemon-config\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268555 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268619 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268634 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268646 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268658 4807 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268698 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268711 4807 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268722 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268734 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268770 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc 
kubenswrapper[4807]: I1127 11:09:39.268783 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268794 4807 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268804 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268817 4807 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268853 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268866 4807 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268878 4807 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268888 4807 reconciler_common.go:293] "Volume detached for volume 
\"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268900 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268937 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268950 4807 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268965 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.268976 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269012 4807 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269025 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269037 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269048 4807 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269058 4807 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269069 4807 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269105 4807 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269118 4807 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269132 4807 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269144 4807 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269182 4807 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269195 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269206 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269218 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269229 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269282 4807 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269294 4807 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269306 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269318 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269330 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269366 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269378 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269389 4807 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269400 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269411 4807 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269447 4807 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269458 4807 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269469 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269481 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269492 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 
11:09:39.269536 4807 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269550 4807 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269563 4807 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269575 4807 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269618 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269630 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269642 4807 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269652 4807 reconciler_common.go:293] "Volume 
detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269661 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269694 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269702 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.269712 4807 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.270023 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovn-node-metrics-cert\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.272116 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/aaae6992-39ea-4c99-b5e5-b4c025ec48f7-proxy-tls\") pod \"machine-config-daemon-kk425\" (UID: \"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\") " pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 
crc kubenswrapper[4807]: I1127 11:09:39.281045 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bxfj\" (UniqueName: \"kubernetes.io/projected/97f15cbb-220e-47db-b418-3a5aa4eb55a2-kube-api-access-5bxfj\") pod \"multus-xmngf\" (UID: \"97f15cbb-220e-47db-b418-3a5aa4eb55a2\") " pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.282696 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nmsn\" (UniqueName: \"kubernetes.io/projected/9c85b740-1df9-4ae7-a51b-fdfd89668d64-kube-api-access-7nmsn\") pod \"ovnkube-node-lwph9\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.283814 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dncsr\" (UniqueName: \"kubernetes.io/projected/aaae6992-39ea-4c99-b5e5-b4c025ec48f7-kube-api-access-dncsr\") pod \"machine-config-daemon-kk425\" (UID: \"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\") " pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.288310 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w44dv\" (UniqueName: \"kubernetes.io/projected/579992dc-49bf-49ea-ad07-62beba6397df-kube-api-access-w44dv\") pod \"multus-additional-cni-plugins-k6wll\" (UID: \"579992dc-49bf-49ea-ad07-62beba6397df\") " pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.289622 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksfsj\" (UniqueName: \"kubernetes.io/projected/e030331a-9097-479c-8226-8553c1423ae4-kube-api-access-ksfsj\") pod \"node-resolver-d4dvd\" (UID: \"e030331a-9097-479c-8226-8553c1423ae4\") " pod="openshift-dns/node-resolver-d4dvd" Nov 27 11:09:39 crc kubenswrapper[4807]: 
I1127 11:09:39.375269 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.379471 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.389995 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.398392 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d4dvd" Nov 27 11:09:39 crc kubenswrapper[4807]: W1127 11:09:39.402520 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-4249b9be89b584c0848593b33f5ed84e821cae4972b89a3d8f7ce941dfa686f2 WatchSource:0}: Error finding container 4249b9be89b584c0848593b33f5ed84e821cae4972b89a3d8f7ce941dfa686f2: Status 404 returned error can't find the container with id 4249b9be89b584c0848593b33f5ed84e821cae4972b89a3d8f7ce941dfa686f2 Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.404823 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xmngf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.413694 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.420657 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.426862 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k6wll" Nov 27 11:09:39 crc kubenswrapper[4807]: W1127 11:09:39.451543 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f15cbb_220e_47db_b418_3a5aa4eb55a2.slice/crio-98c9f320e0e7563bd477ecb00ca164eda33848ac45f23c38578ed3a2ec8ff093 WatchSource:0}: Error finding container 98c9f320e0e7563bd477ecb00ca164eda33848ac45f23c38578ed3a2ec8ff093: Status 404 returned error can't find the container with id 98c9f320e0e7563bd477ecb00ca164eda33848ac45f23c38578ed3a2ec8ff093 Nov 27 11:09:39 crc kubenswrapper[4807]: W1127 11:09:39.478168 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c85b740_1df9_4ae7_a51b_fdfd89668d64.slice/crio-0a5dff762776b30ad73f861bcc7e54e32ac6c0768a16a96458c906d041cee704 WatchSource:0}: Error finding container 0a5dff762776b30ad73f861bcc7e54e32ac6c0768a16a96458c906d041cee704: Status 404 returned error can't find the container with id 0a5dff762776b30ad73f861bcc7e54e32ac6c0768a16a96458c906d041cee704 Nov 27 11:09:39 crc kubenswrapper[4807]: W1127 11:09:39.486752 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaae6992_39ea_4c99_b5e5_b4c025ec48f7.slice/crio-aa1a4c8be10bfbd564fb7d703197011050c3c0551e46ee7d5f8860e8028fa0db WatchSource:0}: Error finding container aa1a4c8be10bfbd564fb7d703197011050c3c0551e46ee7d5f8860e8028fa0db: Status 404 returned error can't find the container with id aa1a4c8be10bfbd564fb7d703197011050c3c0551e46ee7d5f8860e8028fa0db Nov 27 11:09:39 crc kubenswrapper[4807]: W1127 11:09:39.496198 4807 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod579992dc_49bf_49ea_ad07_62beba6397df.slice/crio-bc47aa3ffd4c4bc859027b96212a7bb8b34374759331a0f0a66f9605944b4146 WatchSource:0}: Error finding container bc47aa3ffd4c4bc859027b96212a7bb8b34374759331a0f0a66f9605944b4146: Status 404 returned error can't find the container with id bc47aa3ffd4c4bc859027b96212a7bb8b34374759331a0f0a66f9605944b4146 Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.536426 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.536948 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.538040 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.538636 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.539534 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.540037 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.540601 4807 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.541793 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.542582 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.543810 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.544466 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.545406 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.545829 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.546436 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.547081 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.548295 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.548993 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.554426 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.554935 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.555879 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.556283 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.557043 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.557805 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.559147 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.559917 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.562997 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.563588 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.564953 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" 
path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.565890 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.566659 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.567845 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.568457 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.569556 4807 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.569685 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.572001 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.573367 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.573738 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.573907 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.576453 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.578521 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.579190 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" 
path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.580573 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.581446 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.582623 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.583545 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.584588 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.584980 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.585831 4807 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.587193 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.587906 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.589169 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.590266 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.591496 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.592529 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.593167 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.593915 4807 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.595677 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.596319 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.596874 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.617159 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.633805 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.666603 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn
-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\
\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"n
ame\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.675643 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.675780 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.675826 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:09:40.675810763 +0000 UTC m=+21.775308961 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.675837 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.675855 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.675866 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:40.675857264 +0000 UTC m=+21.775355462 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.675921 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.675942 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:40.675936556 +0000 UTC m=+21.775434754 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.678814 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.679706 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c85b740_1df9_4ae7_a51b_fdfd89668d64.slice/crio-dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c85b740_1df9_4ae7_a51b_fdfd89668d64.slice/crio-conmon-dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447.scope\": RecentStats: unable to find data in memory cache]" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.694053 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, 
/tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.704165 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.713225 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.776290 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.776339 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.776480 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.776498 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.776510 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.776560 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:40.776544641 +0000 UTC m=+21.876042839 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.776886 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.776907 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.776917 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:39 crc kubenswrapper[4807]: E1127 11:09:39.776969 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:40.776951242 +0000 UTC m=+21.876449530 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.920935 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.930616 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.932213 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.933821 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.941949 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.951210 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.966746 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.978212 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plug
ins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\
\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:39 crc kubenswrapper[4807]: I1127 11:09:39.987948 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.000793 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.016415 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.046430 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.060795 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.074111 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.084281 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.084328 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.084338 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"09e4436107c0ed0c64b0fb1cf7bd23db28282e23dff591e9fb2637fc11b42d72"} Nov 27 11:09:40 crc 
kubenswrapper[4807]: I1127 11:09:40.090049 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.091346 4807 generic.go:334] "Generic (PLEG): container finished" podID="579992dc-49bf-49ea-ad07-62beba6397df" containerID="e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed" exitCode=0 Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.091437 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" event={"ID":"579992dc-49bf-49ea-ad07-62beba6397df","Type":"ContainerDied","Data":"e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.091467 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" event={"ID":"579992dc-49bf-49ea-ad07-62beba6397df","Type":"ContainerStarted","Data":"bc47aa3ffd4c4bc859027b96212a7bb8b34374759331a0f0a66f9605944b4146"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.094438 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmngf" event={"ID":"97f15cbb-220e-47db-b418-3a5aa4eb55a2","Type":"ContainerStarted","Data":"396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.094478 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmngf" event={"ID":"97f15cbb-220e-47db-b418-3a5aa4eb55a2","Type":"ContainerStarted","Data":"98c9f320e0e7563bd477ecb00ca164eda33848ac45f23c38578ed3a2ec8ff093"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.098855 4807 generic.go:334] "Generic (PLEG): container finished" podID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerID="dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447" exitCode=0 Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.098924 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.098955 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerStarted","Data":"0a5dff762776b30ad73f861bcc7e54e32ac6c0768a16a96458c906d041cee704"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.101716 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d4dvd" event={"ID":"e030331a-9097-479c-8226-8553c1423ae4","Type":"ContainerStarted","Data":"65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.101784 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d4dvd" event={"ID":"e030331a-9097-479c-8226-8553c1423ae4","Type":"ContainerStarted","Data":"098b3e0f96f0dff7cae1a3ef6570541f93a060bc6d83f7a2601adbcc067f4858"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.107376 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.107422 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c4853b0940d3100d9faac9862370331351454fbd5bec074925c895de2a29dbd7"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.109563 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" 
event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.109607 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"bd76c06730caf399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.109619 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"aa1a4c8be10bfbd564fb7d703197011050c3c0551e46ee7d5f8860e8028fa0db"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.111175 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4249b9be89b584c0848593b33f5ed84e821cae4972b89a3d8f7ce941dfa686f2"} Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.118295 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.134080 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.134381 4807 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.154934 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.165065 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.175991 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.187353 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.201269 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.234011 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.248960 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.260037 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.270877 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.281222 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.301835 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98
af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.324130 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.337291 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.348738 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.367786 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.382524 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.401059 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.413316 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.428269 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.448345 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.459778 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.473325 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.485542 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.507586 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:40Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.532281 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.532348 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.532422 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.532412 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.532740 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.532806 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.688290 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.688415 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:09:42.688393132 +0000 UTC m=+23.787891330 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.688452 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.688489 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.688618 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.688663 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:42.688650769 +0000 UTC m=+23.788148967 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.688674 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.688704 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:42.68869546 +0000 UTC m=+23.788193658 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.789578 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:40 crc kubenswrapper[4807]: I1127 11:09:40.789775 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.789720 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.789847 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.789859 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.789907 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.789914 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:42.78989887 +0000 UTC m=+23.889397068 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.789922 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.789934 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:40 crc kubenswrapper[4807]: E1127 11:09:40.789977 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-11-27 11:09:42.789962642 +0000 UTC m=+23.889460840 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.117078 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerStarted","Data":"2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6"} Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.117129 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerStarted","Data":"9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103"} Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.117142 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerStarted","Data":"6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416"} Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.117168 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerStarted","Data":"5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4"} Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.117181 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" 
event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerStarted","Data":"53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581"} Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.117191 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerStarted","Data":"b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e"} Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.118764 4807 generic.go:334] "Generic (PLEG): container finished" podID="579992dc-49bf-49ea-ad07-62beba6397df" containerID="9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f" exitCode=0 Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.118857 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" event={"ID":"579992dc-49bf-49ea-ad07-62beba6397df","Type":"ContainerDied","Data":"9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f"} Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.136597 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.157718 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.174908 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.192988 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.214361 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.240289 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.266193 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.304827 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.322693 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.346171 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.365071 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.377344 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.388957 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.518188 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-5fv8w"] Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.518527 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5fv8w" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.521057 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.522033 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.522143 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.522667 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.535186 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.546486 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.559885 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.571730 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.588684 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.602785 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.614538 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.626991 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.644086 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.655540 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.667072 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.678506 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.691299 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.698695 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7be9a58d-0876-441d-b6eb-6d0b3412abac-host\") pod \"node-ca-5fv8w\" (UID: \"7be9a58d-0876-441d-b6eb-6d0b3412abac\") " pod="openshift-image-registry/node-ca-5fv8w" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.698728 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7g6r\" (UniqueName: \"kubernetes.io/projected/7be9a58d-0876-441d-b6eb-6d0b3412abac-kube-api-access-d7g6r\") pod \"node-ca-5fv8w\" (UID: \"7be9a58d-0876-441d-b6eb-6d0b3412abac\") " pod="openshift-image-registry/node-ca-5fv8w" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.698769 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7be9a58d-0876-441d-b6eb-6d0b3412abac-serviceca\") pod \"node-ca-5fv8w\" (UID: \"7be9a58d-0876-441d-b6eb-6d0b3412abac\") " pod="openshift-image-registry/node-ca-5fv8w" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.701381 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:41Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.799200 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7be9a58d-0876-441d-b6eb-6d0b3412abac-host\") pod \"node-ca-5fv8w\" (UID: \"7be9a58d-0876-441d-b6eb-6d0b3412abac\") " pod="openshift-image-registry/node-ca-5fv8w" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.799238 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7g6r\" (UniqueName: \"kubernetes.io/projected/7be9a58d-0876-441d-b6eb-6d0b3412abac-kube-api-access-d7g6r\") pod \"node-ca-5fv8w\" (UID: \"7be9a58d-0876-441d-b6eb-6d0b3412abac\") " pod="openshift-image-registry/node-ca-5fv8w" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.799283 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7be9a58d-0876-441d-b6eb-6d0b3412abac-serviceca\") pod \"node-ca-5fv8w\" (UID: \"7be9a58d-0876-441d-b6eb-6d0b3412abac\") " pod="openshift-image-registry/node-ca-5fv8w" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.799423 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7be9a58d-0876-441d-b6eb-6d0b3412abac-host\") pod \"node-ca-5fv8w\" (UID: \"7be9a58d-0876-441d-b6eb-6d0b3412abac\") " pod="openshift-image-registry/node-ca-5fv8w" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.800190 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7be9a58d-0876-441d-b6eb-6d0b3412abac-serviceca\") pod \"node-ca-5fv8w\" (UID: \"7be9a58d-0876-441d-b6eb-6d0b3412abac\") " pod="openshift-image-registry/node-ca-5fv8w" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.818561 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7g6r\" (UniqueName: \"kubernetes.io/projected/7be9a58d-0876-441d-b6eb-6d0b3412abac-kube-api-access-d7g6r\") pod \"node-ca-5fv8w\" 
(UID: \"7be9a58d-0876-441d-b6eb-6d0b3412abac\") " pod="openshift-image-registry/node-ca-5fv8w" Nov 27 11:09:41 crc kubenswrapper[4807]: I1127 11:09:41.833662 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5fv8w" Nov 27 11:09:41 crc kubenswrapper[4807]: W1127 11:09:41.845286 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7be9a58d_0876_441d_b6eb_6d0b3412abac.slice/crio-59ad8dad6b4d2104941fb25953f4a06bdc424a858bbe5c3bb1403f260a4ad6dc WatchSource:0}: Error finding container 59ad8dad6b4d2104941fb25953f4a06bdc424a858bbe5c3bb1403f260a4ad6dc: Status 404 returned error can't find the container with id 59ad8dad6b4d2104941fb25953f4a06bdc424a858bbe5c3bb1403f260a4ad6dc Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.124872 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5fv8w" event={"ID":"7be9a58d-0876-441d-b6eb-6d0b3412abac","Type":"ContainerStarted","Data":"60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13"} Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.124922 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5fv8w" event={"ID":"7be9a58d-0876-441d-b6eb-6d0b3412abac","Type":"ContainerStarted","Data":"59ad8dad6b4d2104941fb25953f4a06bdc424a858bbe5c3bb1403f260a4ad6dc"} Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.126461 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958"} Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.134944 4807 generic.go:334] "Generic (PLEG): container finished" podID="579992dc-49bf-49ea-ad07-62beba6397df" 
containerID="ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690" exitCode=0 Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.135042 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" event={"ID":"579992dc-49bf-49ea-ad07-62beba6397df","Type":"ContainerDied","Data":"ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690"} Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.142259 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-ku
be\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.149678 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5
e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.165464 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.177822 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.198730 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.213026 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.227189 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.237906 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.249470 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.266933 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.279433 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.294461 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.305208 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.317725 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.330264 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.338961 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.354455 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.377316 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.421585 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.457769 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b8
70de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.497570 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.531998 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.532050 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.532007 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.532115 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.532185 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.532265 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.534966 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.586832 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.616195 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.657969 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.696632 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.709148 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.709320 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.709337 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:09:46.709317143 +0000 UTC m=+27.808815341 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.709367 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.709466 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.709468 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.709510 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:46.709500038 +0000 UTC m=+27.808998236 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.709525 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:46.709518618 +0000 UTC m=+27.809016816 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.738219 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.779824 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:42Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.809881 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:42 crc kubenswrapper[4807]: I1127 11:09:42.809935 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.810030 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.810034 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.810051 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.810056 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.810062 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.810067 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.810119 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:46.810095991 +0000 UTC m=+27.909594199 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:42 crc kubenswrapper[4807]: E1127 11:09:42.810135 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:46.810128192 +0000 UTC m=+27.909626390 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.141341 4807 generic.go:334] "Generic (PLEG): container finished" podID="579992dc-49bf-49ea-ad07-62beba6397df" containerID="191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7" exitCode=0 Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.141540 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" event={"ID":"579992dc-49bf-49ea-ad07-62beba6397df","Type":"ContainerDied","Data":"191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7"} Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.150531 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerStarted","Data":"e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3"} Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.152121 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.157052 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.157234 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.159554 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.168821 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.181160 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.192564 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.202768 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.212879 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.233159 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.245021 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.262414 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.272807 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.289720 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.304340 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.316136 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.360059 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.399430 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.437814 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.475295 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.522555 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.561076 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.599450 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.648325 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.676659 4807 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.715017 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.762212 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.801133 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.846074 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.877655 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.923587 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:43 crc kubenswrapper[4807]: I1127 11:09:43.958140 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:43Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.156421 4807 generic.go:334] "Generic (PLEG): container finished" podID="579992dc-49bf-49ea-ad07-62beba6397df" containerID="683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb" exitCode=0 Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.156521 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" 
event={"ID":"579992dc-49bf-49ea-ad07-62beba6397df","Type":"ContainerDied","Data":"683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb"} Nov 27 11:09:44 crc kubenswrapper[4807]: E1127 11:09:44.162314 4807 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.170143 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPa
th\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.181335 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.193834 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.217739 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.231216 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.244922 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.277169 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.304284 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.336380 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.380593 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b8
70de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
1-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.419730 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, 
/tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.457857 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.501821 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.531755 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.531800 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.531755 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:44 crc kubenswrapper[4807]: E1127 11:09:44.531910 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:09:44 crc kubenswrapper[4807]: E1127 11:09:44.532004 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:09:44 crc kubenswrapper[4807]: E1127 11:09:44.532091 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.538371 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.578227 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.745282 4807 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.747631 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.747669 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.747681 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.747800 4807 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.754152 4807 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.754476 4807 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.755494 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.755525 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.755536 4807 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.755549 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.755561 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:44Z","lastTransitionTime":"2025-11-27T11:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:44 crc kubenswrapper[4807]: E1127 11:09:44.768530 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.772582 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.772654 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.772674 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.772701 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.772722 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:44Z","lastTransitionTime":"2025-11-27T11:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:44 crc kubenswrapper[4807]: E1127 11:09:44.790083 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.794622 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.794665 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.794683 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.794706 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.794724 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:44Z","lastTransitionTime":"2025-11-27T11:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:44 crc kubenswrapper[4807]: E1127 11:09:44.812437 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.817302 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.817358 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.817375 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.817397 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.817413 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:44Z","lastTransitionTime":"2025-11-27T11:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:44 crc kubenswrapper[4807]: E1127 11:09:44.832480 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.837677 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.837717 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.837728 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.837745 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.837756 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:44Z","lastTransitionTime":"2025-11-27T11:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:44 crc kubenswrapper[4807]: E1127 11:09:44.852076 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:44Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:44 crc kubenswrapper[4807]: E1127 11:09:44.852263 4807 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.853768 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.853803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.853814 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.853830 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.853841 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:44Z","lastTransitionTime":"2025-11-27T11:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.998230 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.998288 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.998297 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.998311 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:44 crc kubenswrapper[4807]: I1127 11:09:44.998319 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:44Z","lastTransitionTime":"2025-11-27T11:09:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.100457 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.100503 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.100511 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.100525 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.100534 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:45Z","lastTransitionTime":"2025-11-27T11:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.165228 4807 generic.go:334] "Generic (PLEG): container finished" podID="579992dc-49bf-49ea-ad07-62beba6397df" containerID="5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5" exitCode=0 Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.165999 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" event={"ID":"579992dc-49bf-49ea-ad07-62beba6397df","Type":"ContainerDied","Data":"5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5"} Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.187924 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.207035 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.207370 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.207388 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.207410 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.207427 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:45Z","lastTransitionTime":"2025-11-27T11:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.210411 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.230577 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.248411 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, 
/tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.258367 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.268645 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.287992 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.300941 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.310787 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.310816 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.310825 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:45 crc 
kubenswrapper[4807]: I1127 11:09:45.310837 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.310845 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:45Z","lastTransitionTime":"2025-11-27T11:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.314304 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 
11:09:45.318847 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.327473 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.341626 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.353538 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.365974 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.377821 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.391066 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.409781 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.414378 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.414409 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.414417 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.414432 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.414440 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:45Z","lastTransitionTime":"2025-11-27T11:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.421063 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.432774 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.443525 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.455902 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.466039 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.497027 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.516344 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.516375 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.516384 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.516396 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.516405 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:45Z","lastTransitionTime":"2025-11-27T11:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.535835 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.579115 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.618503 4807 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.618544 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.618553 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.618571 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.618593 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:45Z","lastTransitionTime":"2025-11-27T11:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.620737 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z 
is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.662622 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.698109 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.721147 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.721197 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.721209 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.721226 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.721239 4807 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:45Z","lastTransitionTime":"2025-11-27T11:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.740826 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCo
unt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.776349 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.816340 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:45Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.827911 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.827946 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.827954 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.827966 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.827978 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:45Z","lastTransitionTime":"2025-11-27T11:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.930796 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.930843 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.930854 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.930870 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:45 crc kubenswrapper[4807]: I1127 11:09:45.930883 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:45Z","lastTransitionTime":"2025-11-27T11:09:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.036724 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.037214 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.037239 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.037303 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.037326 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:46Z","lastTransitionTime":"2025-11-27T11:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.140581 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.140637 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.140691 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.140716 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.140735 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:46Z","lastTransitionTime":"2025-11-27T11:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.174858 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerStarted","Data":"3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39"} Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.175298 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.175363 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.180728 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" event={"ID":"579992dc-49bf-49ea-ad07-62beba6397df","Type":"ContainerStarted","Data":"769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e"} Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.191599 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.208106 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.208333 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.208495 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.238936 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.243410 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.243479 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.243497 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.243521 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.243541 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:46Z","lastTransitionTime":"2025-11-27T11:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.257203 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.272499 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.286011 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.306006 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.320993 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.335942 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.346575 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.346606 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.346615 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.346629 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.346638 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:46Z","lastTransitionTime":"2025-11-27T11:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.353502 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.374002 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.395316 4807 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"
},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.430719 4807 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.450038 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.450081 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.450092 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.450110 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.450121 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:46Z","lastTransitionTime":"2025-11-27T11:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.460194 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.476397 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\"
,\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.495801 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.516177 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\"
,\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.531683 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.531717 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.531837 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.531991 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.532287 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.532382 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.540980 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039
384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.553939 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.553979 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.553999 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.554017 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.554028 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:46Z","lastTransitionTime":"2025-11-27T11:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.583442 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.617209 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.653216 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.656793 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.656829 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.656837 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.656851 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.656860 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:46Z","lastTransitionTime":"2025-11-27T11:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.695223 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.736606 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.756913 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.757103 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:09:54.757083956 +0000 UTC m=+35.856582144 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.757161 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.757210 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.757326 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.757344 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.757391 4807 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:54.757381944 +0000 UTC m=+35.856880142 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.757407 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:54.757400765 +0000 UTC m=+35.856898963 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.758610 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.758640 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.758650 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.758662 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.758671 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:46Z","lastTransitionTime":"2025-11-27T11:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.775489 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.824068 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.858699 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.858736 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.858868 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.858886 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:09:46 crc 
kubenswrapper[4807]: E1127 11:09:46.858896 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.858936 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:54.858924163 +0000 UTC m=+35.958422361 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.859206 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.859217 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.859224 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Nov 27 11:09:46 crc kubenswrapper[4807]: E1127 11:09:46.859266 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:54.859238631 +0000 UTC m=+35.958736829 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.860118 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.862188 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.862230 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.862260 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:46 crc 
kubenswrapper[4807]: I1127 11:09:46.862277 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.862287 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:46Z","lastTransitionTime":"2025-11-27T11:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.897646 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 
11:09:46.936143 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.964360 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.964404 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.964413 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.964426 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.964435 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:46Z","lastTransitionTime":"2025-11-27T11:09:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:46 crc kubenswrapper[4807]: I1127 11:09:46.979617 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:46Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.014925 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:47Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.066541 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.066588 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.066600 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:47 crc 
kubenswrapper[4807]: I1127 11:09:47.066630 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.066640 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:47Z","lastTransitionTime":"2025-11-27T11:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.169084 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.169144 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.169162 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.169186 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.169202 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:47Z","lastTransitionTime":"2025-11-27T11:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.183173 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.272157 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.272219 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.272239 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.272296 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.272314 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:47Z","lastTransitionTime":"2025-11-27T11:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.375452 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.375507 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.375523 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.375545 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.375561 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:47Z","lastTransitionTime":"2025-11-27T11:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.478080 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.478118 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.478126 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.478140 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.478150 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:47Z","lastTransitionTime":"2025-11-27T11:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.580373 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.580416 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.580429 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.580448 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.580461 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:47Z","lastTransitionTime":"2025-11-27T11:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.682502 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.682535 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.682545 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.682559 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.682567 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:47Z","lastTransitionTime":"2025-11-27T11:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.784560 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.784604 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.784614 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.784627 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.784636 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:47Z","lastTransitionTime":"2025-11-27T11:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.887235 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.887285 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.887293 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.887305 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.887320 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:47Z","lastTransitionTime":"2025-11-27T11:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.990344 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.990389 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.990398 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.990413 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:47 crc kubenswrapper[4807]: I1127 11:09:47.990424 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:47Z","lastTransitionTime":"2025-11-27T11:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.092284 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.092332 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.092344 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.092361 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.092372 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:48Z","lastTransitionTime":"2025-11-27T11:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.186016 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.194845 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.194868 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.194876 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.194890 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.194899 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:48Z","lastTransitionTime":"2025-11-27T11:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.297522 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.297571 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.297583 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.297601 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.297612 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:48Z","lastTransitionTime":"2025-11-27T11:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.400087 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.400154 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.400171 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.400194 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.400210 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:48Z","lastTransitionTime":"2025-11-27T11:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.502043 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.502110 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.502120 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.502138 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.502149 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:48Z","lastTransitionTime":"2025-11-27T11:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.531280 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:48 crc kubenswrapper[4807]: E1127 11:09:48.531457 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.531370 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:48 crc kubenswrapper[4807]: E1127 11:09:48.531630 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.531316 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:48 crc kubenswrapper[4807]: E1127 11:09:48.531785 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.604095 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.604125 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.604133 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.604145 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.604154 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:48Z","lastTransitionTime":"2025-11-27T11:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.706982 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.707223 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.707302 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.707368 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.707427 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:48Z","lastTransitionTime":"2025-11-27T11:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.809946 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.809994 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.810012 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.810037 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.810055 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:48Z","lastTransitionTime":"2025-11-27T11:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.912686 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.912756 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.912774 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.912799 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:48 crc kubenswrapper[4807]: I1127 11:09:48.912816 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:48Z","lastTransitionTime":"2025-11-27T11:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.015082 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.015312 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.015373 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.015439 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.015508 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:49Z","lastTransitionTime":"2025-11-27T11:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.117661 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.117698 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.117708 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.117726 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.117736 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:49Z","lastTransitionTime":"2025-11-27T11:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.190569 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/0.log" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.193653 4807 generic.go:334] "Generic (PLEG): container finished" podID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerID="3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39" exitCode=1 Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.193705 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39"} Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.194556 4807 scope.go:117] "RemoveContainer" containerID="3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.206077 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\"
,\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.219979 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.220023 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.220036 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.220053 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.220066 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:49Z","lastTransitionTime":"2025-11-27T11:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.221490 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z 
is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.239808 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:48Z\\\",\\\"message\\\":\\\"de event handler 7\\\\nI1127 11:09:48.186699 6048 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.186777 6048 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186823 6048 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186951 6048 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187257 6048 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187361 6048 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187497 6048 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187892 6048 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.254473 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19166
3197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.266135 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.274777 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.286975 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.298087 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.320623 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.322560 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.322642 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.322653 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.322671 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.322683 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:49Z","lastTransitionTime":"2025-11-27T11:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.332980 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.346095 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.356790 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.370261 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.381260 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.389609 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.425159 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.425386 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.425445 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.425504 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.425570 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:49Z","lastTransitionTime":"2025-11-27T11:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.531684 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.532518 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.532597 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.532703 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.532788 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:49Z","lastTransitionTime":"2025-11-27T11:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.549544 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c0
84ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.565217 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.575155 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.587975 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\"
,\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.599777 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.617844 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:48Z\\\",\\\"message\\\":\\\"de event handler 7\\\\nI1127 11:09:48.186699 6048 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.186777 6048 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186823 6048 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186951 6048 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187257 6048 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187361 6048 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187497 6048 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187892 6048 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.635269 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.635296 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.635304 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.635316 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.635324 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:49Z","lastTransitionTime":"2025-11-27T11:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.636706 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"n
ame\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3
bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.649999 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.659990 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.684490 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.704576 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.719199 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.732181 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.737476 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.737522 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.737534 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.737553 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.737564 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:49Z","lastTransitionTime":"2025-11-27T11:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.751890 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.763037 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:49Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.839820 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.839856 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.839867 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:49 crc 
kubenswrapper[4807]: I1127 11:09:49.839881 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.839891 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:49Z","lastTransitionTime":"2025-11-27T11:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.942230 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.942300 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.942308 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.942321 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:49 crc kubenswrapper[4807]: I1127 11:09:49.942330 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:49Z","lastTransitionTime":"2025-11-27T11:09:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.044679 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.044708 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.044716 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.044728 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.044736 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:50Z","lastTransitionTime":"2025-11-27T11:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.146583 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.146611 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.146619 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.146632 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.146641 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:50Z","lastTransitionTime":"2025-11-27T11:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.198013 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/1.log" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.198539 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/0.log" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.201594 4807 generic.go:334] "Generic (PLEG): container finished" podID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerID="c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe" exitCode=1 Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.201633 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe"} Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.201672 4807 scope.go:117] "RemoveContainer" containerID="3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.202198 4807 scope.go:117] "RemoveContainer" containerID="c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe" Nov 27 11:09:50 crc kubenswrapper[4807]: E1127 11:09:50.202393 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.215224 4807 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.225408 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.243466 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.248900 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.248924 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.248932 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.248945 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 
11:09:50.248955 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:50Z","lastTransitionTime":"2025-11-27T11:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.254896 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.272423 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.286376 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.304635 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.314824 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.326488 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.338121 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.347714 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.351415 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.351449 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.351461 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.351478 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.351491 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:50Z","lastTransitionTime":"2025-11-27T11:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.360664 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.373687 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.390183 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:48Z\\\",\\\"message\\\":\\\"de event handler 7\\\\nI1127 11:09:48.186699 6048 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.186777 6048 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186823 6048 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186951 6048 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187257 6048 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187361 6048 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187497 6048 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187892 6048 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:50Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130557 6190 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130819 6190 obj_retry.go:551] Creating *factory.egressNode crc took: 6.095286ms\\\\nI1127 11:09:50.130841 6190 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 11:09:50.130863 6190 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 11:09:50.131300 6190 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 11:09:50.131455 6190 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 11:09:50.131502 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:09:50.131529 6190 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 11:09:50.131618 6190 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\
\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\
"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc 
kubenswrapper[4807]: I1127 11:09:50.401991 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f21
4d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:50Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.454537 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.454603 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.454629 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.454670 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.454695 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:50Z","lastTransitionTime":"2025-11-27T11:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.531399 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.531399 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:50 crc kubenswrapper[4807]: E1127 11:09:50.531588 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:09:50 crc kubenswrapper[4807]: E1127 11:09:50.531724 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.531434 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:50 crc kubenswrapper[4807]: E1127 11:09:50.531856 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.557028 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.557081 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.557097 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.557119 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.557137 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:50Z","lastTransitionTime":"2025-11-27T11:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.660122 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.660315 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.660344 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.660542 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.660744 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:50Z","lastTransitionTime":"2025-11-27T11:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.763932 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.763983 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.763994 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.764012 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.764024 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:50Z","lastTransitionTime":"2025-11-27T11:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.867133 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.867210 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.867234 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.867296 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.867319 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:50Z","lastTransitionTime":"2025-11-27T11:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.970232 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.970305 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.970313 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.970330 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:50 crc kubenswrapper[4807]: I1127 11:09:50.970341 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:50Z","lastTransitionTime":"2025-11-27T11:09:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.011035 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f"] Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.011654 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.013737 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.014408 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.024557 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.038882 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.044228 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fc25371-fd6f-439c-b3e0-415f96822338-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-btg9f\" (UID: \"8fc25371-fd6f-439c-b3e0-415f96822338\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.044310 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcnq4\" (UniqueName: \"kubernetes.io/projected/8fc25371-fd6f-439c-b3e0-415f96822338-kube-api-access-mcnq4\") pod \"ovnkube-control-plane-749d76644c-btg9f\" (UID: \"8fc25371-fd6f-439c-b3e0-415f96822338\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.044366 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fc25371-fd6f-439c-b3e0-415f96822338-env-overrides\") pod \"ovnkube-control-plane-749d76644c-btg9f\" (UID: \"8fc25371-fd6f-439c-b3e0-415f96822338\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.044393 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fc25371-fd6f-439c-b3e0-415f96822338-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-btg9f\" (UID: \"8fc25371-fd6f-439c-b3e0-415f96822338\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.049745 4807 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.064562 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\
\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.072482 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.072524 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.072536 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.072554 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.072567 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:51Z","lastTransitionTime":"2025-11-27T11:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.078619 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e
839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.097736 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.127276 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:48Z\\\",\\\"message\\\":\\\"de event handler 7\\\\nI1127 11:09:48.186699 6048 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.186777 6048 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186823 6048 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186951 6048 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187257 6048 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187361 6048 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187497 6048 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187892 6048 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:50Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130557 6190 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130819 6190 obj_retry.go:551] Creating *factory.egressNode crc took: 6.095286ms\\\\nI1127 11:09:50.130841 6190 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 11:09:50.130863 6190 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 11:09:50.131300 6190 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 11:09:50.131455 6190 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 11:09:50.131502 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:09:50.131529 6190 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 11:09:50.131618 6190 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\
\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\
"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc 
kubenswrapper[4807]: I1127 11:09:51.144832 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fc25371-fd6f-439c-b3e0-415f96822338-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-btg9f\" (UID: \"8fc25371-fd6f-439c-b3e0-415f96822338\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.144862 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcnq4\" (UniqueName: \"kubernetes.io/projected/8fc25371-fd6f-439c-b3e0-415f96822338-kube-api-access-mcnq4\") pod \"ovnkube-control-plane-749d76644c-btg9f\" (UID: \"8fc25371-fd6f-439c-b3e0-415f96822338\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.144892 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fc25371-fd6f-439c-b3e0-415f96822338-env-overrides\") pod \"ovnkube-control-plane-749d76644c-btg9f\" (UID: \"8fc25371-fd6f-439c-b3e0-415f96822338\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.144908 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8fc25371-fd6f-439c-b3e0-415f96822338-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-btg9f\" (UID: \"8fc25371-fd6f-439c-b3e0-415f96822338\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.145573 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.146103 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8fc25371-fd6f-439c-b3e0-415f96822338-env-overrides\") pod \"ovnkube-control-plane-749d76644c-btg9f\" (UID: \"8fc25371-fd6f-439c-b3e0-415f96822338\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.148012 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8fc25371-fd6f-439c-b3e0-415f96822338-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-btg9f\" (UID: \"8fc25371-fd6f-439c-b3e0-415f96822338\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.155149 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/8fc25371-fd6f-439c-b3e0-415f96822338-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-btg9f\" (UID: \"8fc25371-fd6f-439c-b3e0-415f96822338\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.163514 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:3
9Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.168726 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcnq4\" (UniqueName: \"kubernetes.io/projected/8fc25371-fd6f-439c-b3e0-415f96822338-kube-api-access-mcnq4\") pod \"ovnkube-control-plane-749d76644c-btg9f\" (UID: \"8fc25371-fd6f-439c-b3e0-415f96822338\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.175598 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.175624 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.175632 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.175646 4807 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.175654 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:51Z","lastTransitionTime":"2025-11-27T11:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.181113 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T
11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.203808 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.206384 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/1.log" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.219528 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730ca
f399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.232591 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.249884 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f6
2b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.261659 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.274313 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:51Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.277637 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.277662 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.277672 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.277683 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.277692 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:51Z","lastTransitionTime":"2025-11-27T11:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.327090 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" Nov 27 11:09:51 crc kubenswrapper[4807]: W1127 11:09:51.346913 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fc25371_fd6f_439c_b3e0_415f96822338.slice/crio-e8e3f1d0127fdda123f4e3c15cd5f9f12a7ab7c44bf9cc4a90f706d1736bd1bb WatchSource:0}: Error finding container e8e3f1d0127fdda123f4e3c15cd5f9f12a7ab7c44bf9cc4a90f706d1736bd1bb: Status 404 returned error can't find the container with id e8e3f1d0127fdda123f4e3c15cd5f9f12a7ab7c44bf9cc4a90f706d1736bd1bb Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.380091 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.380168 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.380180 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.380198 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.380211 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:51Z","lastTransitionTime":"2025-11-27T11:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.482264 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.482309 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.482324 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.482340 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.482353 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:51Z","lastTransitionTime":"2025-11-27T11:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.585206 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.585334 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.585361 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.585390 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.585413 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:51Z","lastTransitionTime":"2025-11-27T11:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.687593 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.687719 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.687743 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.687775 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.687802 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:51Z","lastTransitionTime":"2025-11-27T11:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.790537 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.790580 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.790590 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.790608 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.790619 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:51Z","lastTransitionTime":"2025-11-27T11:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.893541 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.893588 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.893601 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.893621 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.893633 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:51Z","lastTransitionTime":"2025-11-27T11:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.997563 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.997624 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.997638 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.997659 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:51 crc kubenswrapper[4807]: I1127 11:09:51.997673 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:51Z","lastTransitionTime":"2025-11-27T11:09:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.100763 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.100847 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.100861 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.100877 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.100888 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:52Z","lastTransitionTime":"2025-11-27T11:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.203973 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.204023 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.204038 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.204058 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.204074 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:52Z","lastTransitionTime":"2025-11-27T11:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.213388 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" event={"ID":"8fc25371-fd6f-439c-b3e0-415f96822338","Type":"ContainerStarted","Data":"8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1117d892bfd3e1c683315b2def0c"} Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.213445 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" event={"ID":"8fc25371-fd6f-439c-b3e0-415f96822338","Type":"ContainerStarted","Data":"e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088"} Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.213465 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" event={"ID":"8fc25371-fd6f-439c-b3e0-415f96822338","Type":"ContainerStarted","Data":"e8e3f1d0127fdda123f4e3c15cd5f9f12a7ab7c44bf9cc4a90f706d1736bd1bb"} Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.224866 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.239758 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.250422 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.262605 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\"
,\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.274547 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.291712 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:48Z\\\",\\\"message\\\":\\\"de event handler 7\\\\nI1127 11:09:48.186699 6048 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.186777 6048 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186823 6048 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186951 6048 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187257 6048 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187361 6048 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187497 6048 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187892 6048 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:50Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130557 6190 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130819 6190 obj_retry.go:551] Creating *factory.egressNode crc took: 6.095286ms\\\\nI1127 11:09:50.130841 6190 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 11:09:50.130863 6190 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 11:09:50.131300 6190 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 11:09:50.131455 6190 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 11:09:50.131502 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:09:50.131529 6190 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 11:09:50.131618 6190 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\
\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\
"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc 
kubenswrapper[4807]: I1127 11:09:52.306082 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f21
4d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.307011 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.307048 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.307060 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.307074 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.307084 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:52Z","lastTransitionTime":"2025-11-27T11:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.316570 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.329295 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.342799 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.352999 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.370703 4807 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7ef
eaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51
ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.381731 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.391874 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.403225 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.408691 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.408762 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.408776 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.408795 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.408808 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:52Z","lastTransitionTime":"2025-11-27T11:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.415440 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.509805 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wszmz"] Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.510492 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:09:52 crc kubenswrapper[4807]: E1127 11:09:52.510580 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.510629 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.510660 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.510669 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.510683 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.510692 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:52Z","lastTransitionTime":"2025-11-27T11:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.526005 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.532055 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.532125 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:52 crc kubenswrapper[4807]: E1127 11:09:52.532171 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:09:52 crc kubenswrapper[4807]: E1127 11:09:52.532307 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.532415 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:52 crc kubenswrapper[4807]: E1127 11:09:52.532517 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.535393 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.547881 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.557406 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.557501 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.557563 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjkg5\" (UniqueName: \"kubernetes.io/projected/911bce2f-3fb2-484d-870f-d9737047bd10-kube-api-access-sjkg5\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.567127 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.582801 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f
8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-
27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.596144 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.606517 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.613201 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.613319 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.613420 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.613495 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.613556 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:52Z","lastTransitionTime":"2025-11-27T11:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.616675 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.627839 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fd
ee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.639062 4807 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.648523 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tm
p/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.658068 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.658160 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjkg5\" (UniqueName: \"kubernetes.io/projected/911bce2f-3fb2-484d-870f-d9737047bd10-kube-api-access-sjkg5\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.658185 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:09:52 crc kubenswrapper[4807]: E1127 11:09:52.658313 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:09:52 crc kubenswrapper[4807]: E1127 11:09:52.658366 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs podName:911bce2f-3fb2-484d-870f-d9737047bd10 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:53.158352295 +0000 UTC m=+34.257850493 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs") pod "network-metrics-daemon-wszmz" (UID: "911bce2f-3fb2-484d-870f-d9737047bd10") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.669447 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay
.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.677235 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjkg5\" (UniqueName: \"kubernetes.io/projected/911bce2f-3fb2-484d-870f-d9737047bd10-kube-api-access-sjkg5\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.682193 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.699273 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:48Z\\\",\\\"message\\\":\\\"de event handler 7\\\\nI1127 11:09:48.186699 6048 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.186777 6048 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186823 6048 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186951 6048 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187257 6048 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187361 6048 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187497 6048 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187892 6048 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:50Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130557 6190 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130819 6190 obj_retry.go:551] Creating *factory.egressNode crc took: 6.095286ms\\\\nI1127 11:09:50.130841 6190 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 11:09:50.130863 6190 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 11:09:50.131300 6190 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 11:09:50.131455 6190 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 11:09:50.131502 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:09:50.131529 6190 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 11:09:50.131618 6190 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\
\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\
"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc 
kubenswrapper[4807]: I1127 11:09:52.710867 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f21
4d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:52Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.715191 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.715217 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.715226 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.715238 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.715293 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:52Z","lastTransitionTime":"2025-11-27T11:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.817678 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.817714 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.817726 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.817742 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.817755 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:52Z","lastTransitionTime":"2025-11-27T11:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.920608 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.920650 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.920662 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.920679 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:52 crc kubenswrapper[4807]: I1127 11:09:52.920692 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:52Z","lastTransitionTime":"2025-11-27T11:09:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.023044 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.023087 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.023103 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.023126 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.023142 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:53Z","lastTransitionTime":"2025-11-27T11:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.126401 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.126483 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.126512 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.126542 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.126562 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:53Z","lastTransitionTime":"2025-11-27T11:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.162376 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:09:53 crc kubenswrapper[4807]: E1127 11:09:53.162535 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:09:53 crc kubenswrapper[4807]: E1127 11:09:53.162619 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs podName:911bce2f-3fb2-484d-870f-d9737047bd10 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:54.162596128 +0000 UTC m=+35.262094366 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs") pod "network-metrics-daemon-wszmz" (UID: "911bce2f-3fb2-484d-870f-d9737047bd10") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.228756 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.228801 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.228813 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.228832 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.228848 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:53Z","lastTransitionTime":"2025-11-27T11:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.331038 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.331106 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.331124 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.331149 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.331166 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:53Z","lastTransitionTime":"2025-11-27T11:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.434446 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.434495 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.434507 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.434525 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.434537 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:53Z","lastTransitionTime":"2025-11-27T11:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.539128 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.539199 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.539234 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.539294 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.539314 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:53Z","lastTransitionTime":"2025-11-27T11:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.642429 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.642513 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.642532 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.642554 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.642572 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:53Z","lastTransitionTime":"2025-11-27T11:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.746310 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.746363 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.746375 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.746392 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.746403 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:53Z","lastTransitionTime":"2025-11-27T11:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.849188 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.849321 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.849344 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.849375 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.849397 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:53Z","lastTransitionTime":"2025-11-27T11:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.952291 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.952385 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.952405 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.952437 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:53 crc kubenswrapper[4807]: I1127 11:09:53.952506 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:53Z","lastTransitionTime":"2025-11-27T11:09:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.055280 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.055330 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.055342 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.055359 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.055371 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:54Z","lastTransitionTime":"2025-11-27T11:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.158048 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.158100 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.158118 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.158138 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.158153 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:54Z","lastTransitionTime":"2025-11-27T11:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.172751 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.172888 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.172955 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs podName:911bce2f-3fb2-484d-870f-d9737047bd10 nodeName:}" failed. No retries permitted until 2025-11-27 11:09:56.172933485 +0000 UTC m=+37.272431693 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs") pod "network-metrics-daemon-wszmz" (UID: "911bce2f-3fb2-484d-870f-d9737047bd10") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.261032 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.261095 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.261120 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.261150 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.261172 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:54Z","lastTransitionTime":"2025-11-27T11:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.363826 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.363884 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.363918 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.363953 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.363979 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:54Z","lastTransitionTime":"2025-11-27T11:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.467112 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.467182 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.467208 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.467239 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.467308 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:54Z","lastTransitionTime":"2025-11-27T11:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.532088 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.532158 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.532317 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.532371 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.532385 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.532476 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.532612 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.532755 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.569380 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.569437 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.569453 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.569476 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.569493 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:54Z","lastTransitionTime":"2025-11-27T11:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.672678 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.672757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.672775 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.672886 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.672899 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:54Z","lastTransitionTime":"2025-11-27T11:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.775564 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.775609 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.775618 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.775631 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.775640 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:54Z","lastTransitionTime":"2025-11-27T11:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.778134 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.778217 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.778302 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.778378 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.778441 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:10:10.778402149 +0000 UTC m=+51.877900387 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.778511 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:10:10.778495352 +0000 UTC m=+51.877993590 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.778513 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.778636 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:10:10.778601915 +0000 UTC m=+51.878100143 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.879147 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.879313 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.879349 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.879373 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.879386 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.879448 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 11:10:10.879427575 +0000 UTC m=+51.978925783 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.879471 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.879499 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.879519 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:54 crc kubenswrapper[4807]: E1127 11:09:54.879586 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-11-27 11:10:10.879564589 +0000 UTC m=+51.979062827 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.879466 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.879630 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.879647 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.879671 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.879691 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:54Z","lastTransitionTime":"2025-11-27T11:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.982568 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.982626 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.982644 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.982668 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:54 crc kubenswrapper[4807]: I1127 11:09:54.982684 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:54Z","lastTransitionTime":"2025-11-27T11:09:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.085606 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.085670 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.085693 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.085722 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.085749 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.095464 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.095844 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.096091 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.096138 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.096161 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: E1127 11:09:55.118139 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:55Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.123709 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.123776 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.123802 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.123834 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.123857 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: E1127 11:09:55.143202 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:55Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.149306 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.149390 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.149415 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.149441 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.149461 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: E1127 11:09:55.168051 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:55Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.172647 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.172701 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.172720 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.172751 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.172772 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: E1127 11:09:55.189072 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:55Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.194405 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.194487 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.194504 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.194527 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.194545 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: E1127 11:09:55.212457 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:55Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:55 crc kubenswrapper[4807]: E1127 11:09:55.212567 4807 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.214334 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.214394 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.214415 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.214440 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.214459 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.317387 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.317715 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.317736 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.317762 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.317784 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.422576 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.422621 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.422632 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.422652 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.422666 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.526088 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.526148 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.526158 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.526176 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.526188 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.628342 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.628398 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.628415 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.628444 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.628468 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.730620 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.730720 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.730732 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.730749 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.730761 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.833169 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.833426 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.833492 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.833569 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.833646 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.937400 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.937436 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.937447 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.937463 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:55 crc kubenswrapper[4807]: I1127 11:09:55.937474 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:55Z","lastTransitionTime":"2025-11-27T11:09:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.039874 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.039936 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.039952 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.039975 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.039990 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:56Z","lastTransitionTime":"2025-11-27T11:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.142342 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.142386 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.142397 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.142412 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.142426 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:56Z","lastTransitionTime":"2025-11-27T11:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.194206 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:09:56 crc kubenswrapper[4807]: E1127 11:09:56.194408 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:09:56 crc kubenswrapper[4807]: E1127 11:09:56.194708 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs podName:911bce2f-3fb2-484d-870f-d9737047bd10 nodeName:}" failed. No retries permitted until 2025-11-27 11:10:00.194685729 +0000 UTC m=+41.294183927 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs") pod "network-metrics-daemon-wszmz" (UID: "911bce2f-3fb2-484d-870f-d9737047bd10") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.244872 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.245089 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.245203 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.245294 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.245360 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:56Z","lastTransitionTime":"2025-11-27T11:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.347648 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.347688 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.347704 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.347719 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.347731 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:56Z","lastTransitionTime":"2025-11-27T11:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.450010 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.450052 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.450061 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.450074 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.450082 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:56Z","lastTransitionTime":"2025-11-27T11:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.532200 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.532281 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:56 crc kubenswrapper[4807]: E1127 11:09:56.532337 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.532349 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:09:56 crc kubenswrapper[4807]: E1127 11:09:56.532402 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.532435 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:56 crc kubenswrapper[4807]: E1127 11:09:56.532508 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:09:56 crc kubenswrapper[4807]: E1127 11:09:56.532573 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.552835 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.552874 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.552886 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.552902 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.552913 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:56Z","lastTransitionTime":"2025-11-27T11:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.656176 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.656226 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.656238 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.656284 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.656295 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:56Z","lastTransitionTime":"2025-11-27T11:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.758295 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.758332 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.758340 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.758352 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.758362 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:56Z","lastTransitionTime":"2025-11-27T11:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.862105 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.862148 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.862158 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.862176 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.862185 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:56Z","lastTransitionTime":"2025-11-27T11:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.965347 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.965416 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.965431 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.965457 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:56 crc kubenswrapper[4807]: I1127 11:09:56.965476 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:56Z","lastTransitionTime":"2025-11-27T11:09:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.068224 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.068488 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.068507 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.068533 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.068550 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:57Z","lastTransitionTime":"2025-11-27T11:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.172235 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.172325 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.172339 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.172360 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.172375 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:57Z","lastTransitionTime":"2025-11-27T11:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.275748 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.275803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.275834 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.275857 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.275870 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:57Z","lastTransitionTime":"2025-11-27T11:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.379300 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.379372 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.379385 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.379406 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.379420 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:57Z","lastTransitionTime":"2025-11-27T11:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.483172 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.483237 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.483301 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.483333 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.483358 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:57Z","lastTransitionTime":"2025-11-27T11:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.585027 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.585085 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.585094 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.585108 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.585117 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:57Z","lastTransitionTime":"2025-11-27T11:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.688828 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.688888 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.688911 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.688944 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.688968 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:57Z","lastTransitionTime":"2025-11-27T11:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.791764 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.791826 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.791844 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.791869 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.791889 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:57Z","lastTransitionTime":"2025-11-27T11:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.894492 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.894575 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.894594 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.894618 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.894636 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:57Z","lastTransitionTime":"2025-11-27T11:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.997377 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.997438 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.997455 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.997478 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:57 crc kubenswrapper[4807]: I1127 11:09:57.997494 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:57Z","lastTransitionTime":"2025-11-27T11:09:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.100271 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.100316 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.100328 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.100346 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.100361 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:58Z","lastTransitionTime":"2025-11-27T11:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.203162 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.203220 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.203239 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.203296 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.203312 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:58Z","lastTransitionTime":"2025-11-27T11:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.306448 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.306521 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.306539 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.306568 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.306586 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:58Z","lastTransitionTime":"2025-11-27T11:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.410037 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.410149 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.410174 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.410203 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.410228 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:58Z","lastTransitionTime":"2025-11-27T11:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.512590 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.512636 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.512647 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.512662 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.512674 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:58Z","lastTransitionTime":"2025-11-27T11:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.531411 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.531513 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.531518 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:09:58 crc kubenswrapper[4807]: E1127 11:09:58.531643 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.531673 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:09:58 crc kubenswrapper[4807]: E1127 11:09:58.531849 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:09:58 crc kubenswrapper[4807]: E1127 11:09:58.531992 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:09:58 crc kubenswrapper[4807]: E1127 11:09:58.532081 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.615632 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.615696 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.615720 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.615740 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.615754 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:58Z","lastTransitionTime":"2025-11-27T11:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.718737 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.718783 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.718795 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.718816 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.718828 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:58Z","lastTransitionTime":"2025-11-27T11:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.821463 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.821513 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.821524 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.821556 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.821569 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:58Z","lastTransitionTime":"2025-11-27T11:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.924499 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.924551 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.924562 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.924580 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:58 crc kubenswrapper[4807]: I1127 11:09:58.924593 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:58Z","lastTransitionTime":"2025-11-27T11:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.028072 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.028128 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.028146 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.028169 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.028277 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:59Z","lastTransitionTime":"2025-11-27T11:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.130304 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.130383 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.130402 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.130426 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.130444 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:59Z","lastTransitionTime":"2025-11-27T11:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.235295 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.235345 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.235362 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.235384 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.235402 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:59Z","lastTransitionTime":"2025-11-27T11:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.338356 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.338437 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.338462 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.338487 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.338504 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:59Z","lastTransitionTime":"2025-11-27T11:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.441454 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.441501 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.441515 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.441534 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.441547 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:59Z","lastTransitionTime":"2025-11-27T11:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.543866 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.543929 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.543945 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.543965 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.543981 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:59Z","lastTransitionTime":"2025-11-27T11:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.553891 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.572444 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b2
25078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.587377 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.610001 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3af742092522811c2c008896733036361272a09485d366f75ecc62dd5d147d39\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:48Z\\\",\\\"message\\\":\\\"de event handler 7\\\\nI1127 11:09:48.186699 6048 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.186777 6048 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186823 6048 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1127 11:09:48.186951 6048 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187257 6048 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187361 6048 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1127 11:09:48.187497 6048 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1127 11:09:48.187892 6048 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:50Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130557 6190 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130819 6190 obj_retry.go:551] Creating *factory.egressNode crc took: 6.095286ms\\\\nI1127 11:09:50.130841 6190 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 11:09:50.130863 6190 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 11:09:50.131300 6190 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 11:09:50.131455 6190 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 11:09:50.131502 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:09:50.131529 6190 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 11:09:50.131618 6190 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\
\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\
"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc 
kubenswrapper[4807]: I1127 11:09:59.622669 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.633991 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.646150 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.646176 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.646282 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.646303 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.646315 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:59Z","lastTransitionTime":"2025-11-27T11:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.647326 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.656698 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.666372 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.676982 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1
117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.693311 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.703667 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.716512 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.728105 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.739414 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.749021 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.749077 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.749091 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.749109 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.749120 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:59Z","lastTransitionTime":"2025-11-27T11:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.749862 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.762939 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:09:59Z is after 2025-08-24T17:21:41Z" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.851753 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.851801 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.851813 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.851826 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.851835 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:59Z","lastTransitionTime":"2025-11-27T11:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.954104 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.954136 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.954146 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.954159 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:09:59 crc kubenswrapper[4807]: I1127 11:09:59.954167 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:09:59Z","lastTransitionTime":"2025-11-27T11:09:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.056557 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.056591 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.056603 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.056648 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.056659 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:00Z","lastTransitionTime":"2025-11-27T11:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.159574 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.159609 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.159620 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.159637 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.159647 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:00Z","lastTransitionTime":"2025-11-27T11:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.235366 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:00 crc kubenswrapper[4807]: E1127 11:10:00.235544 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:10:00 crc kubenswrapper[4807]: E1127 11:10:00.235602 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs podName:911bce2f-3fb2-484d-870f-d9737047bd10 nodeName:}" failed. No retries permitted until 2025-11-27 11:10:08.235587096 +0000 UTC m=+49.335085294 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs") pod "network-metrics-daemon-wszmz" (UID: "911bce2f-3fb2-484d-870f-d9737047bd10") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.261097 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.261123 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.261133 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.261145 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.261154 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:00Z","lastTransitionTime":"2025-11-27T11:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.363421 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.363455 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.363463 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.363475 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.363483 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:00Z","lastTransitionTime":"2025-11-27T11:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.466722 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.466788 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.466813 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.466835 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.466851 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:00Z","lastTransitionTime":"2025-11-27T11:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.531627 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.531652 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.531670 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:00 crc kubenswrapper[4807]: E1127 11:10:00.531749 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.531818 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:00 crc kubenswrapper[4807]: E1127 11:10:00.531857 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:00 crc kubenswrapper[4807]: E1127 11:10:00.532000 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:00 crc kubenswrapper[4807]: E1127 11:10:00.532143 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.569200 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.569236 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.569271 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.569285 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.569295 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:00Z","lastTransitionTime":"2025-11-27T11:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.673990 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.674022 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.674032 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.674048 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.674058 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:00Z","lastTransitionTime":"2025-11-27T11:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.777000 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.777038 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.777047 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.777061 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.777072 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:00Z","lastTransitionTime":"2025-11-27T11:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.879044 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.879077 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.879088 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.879101 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.879109 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:00Z","lastTransitionTime":"2025-11-27T11:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.981207 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.981289 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.981297 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.981311 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:00 crc kubenswrapper[4807]: I1127 11:10:00.981319 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:00Z","lastTransitionTime":"2025-11-27T11:10:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.084506 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.084540 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.084548 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.084561 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.084569 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:01Z","lastTransitionTime":"2025-11-27T11:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.186829 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.186872 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.186885 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.186902 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.186915 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:01Z","lastTransitionTime":"2025-11-27T11:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.288659 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.288711 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.288722 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.288739 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.288751 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:01Z","lastTransitionTime":"2025-11-27T11:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.391549 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.391591 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.391602 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.391619 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.391631 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:01Z","lastTransitionTime":"2025-11-27T11:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.493938 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.493976 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.493986 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.494004 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.494014 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:01Z","lastTransitionTime":"2025-11-27T11:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.596569 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.596606 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.596618 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.596633 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.596643 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:01Z","lastTransitionTime":"2025-11-27T11:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.698799 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.698862 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.698884 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.698913 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.698933 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:01Z","lastTransitionTime":"2025-11-27T11:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.801010 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.801037 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.801045 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.801057 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.801065 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:01Z","lastTransitionTime":"2025-11-27T11:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.903595 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.903624 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.903632 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.903643 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:01 crc kubenswrapper[4807]: I1127 11:10:01.903652 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:01Z","lastTransitionTime":"2025-11-27T11:10:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.006334 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.006363 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.006371 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.006384 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.006393 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:02Z","lastTransitionTime":"2025-11-27T11:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.108861 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.108909 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.108922 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.108940 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.108959 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:02Z","lastTransitionTime":"2025-11-27T11:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.211566 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.211596 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.211606 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.211621 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.211630 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:02Z","lastTransitionTime":"2025-11-27T11:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.313726 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.313757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.313767 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.313781 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.313797 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:02Z","lastTransitionTime":"2025-11-27T11:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.415987 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.416018 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.416027 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.416039 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.416047 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:02Z","lastTransitionTime":"2025-11-27T11:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.517813 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.517848 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.517857 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.517869 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.517877 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:02Z","lastTransitionTime":"2025-11-27T11:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.531607 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:02 crc kubenswrapper[4807]: E1127 11:10:02.531724 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.532033 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:02 crc kubenswrapper[4807]: E1127 11:10:02.532109 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.532145 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:02 crc kubenswrapper[4807]: E1127 11:10:02.532195 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.532224 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:02 crc kubenswrapper[4807]: E1127 11:10:02.532286 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.620838 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.620883 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.620892 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.620905 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.620914 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:02Z","lastTransitionTime":"2025-11-27T11:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.724449 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.724505 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.724514 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.724528 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.724537 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:02Z","lastTransitionTime":"2025-11-27T11:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.830877 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.830933 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.830944 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.830961 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.830974 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:02Z","lastTransitionTime":"2025-11-27T11:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.933956 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.934026 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.934044 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.934066 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:02 crc kubenswrapper[4807]: I1127 11:10:02.934083 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:02Z","lastTransitionTime":"2025-11-27T11:10:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.036896 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.036930 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.036941 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.036955 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.036965 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:03Z","lastTransitionTime":"2025-11-27T11:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.140014 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.140071 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.140093 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.140119 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.140140 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:03Z","lastTransitionTime":"2025-11-27T11:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.242937 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.242979 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.242988 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.243001 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.243010 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:03Z","lastTransitionTime":"2025-11-27T11:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.346086 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.346129 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.346140 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.346155 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.346166 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:03Z","lastTransitionTime":"2025-11-27T11:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.449006 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.449083 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.449108 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.449137 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.449159 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:03Z","lastTransitionTime":"2025-11-27T11:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.552186 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.552290 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.552314 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.552341 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.552362 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:03Z","lastTransitionTime":"2025-11-27T11:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.655848 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.655910 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.655936 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.655963 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.655981 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:03Z","lastTransitionTime":"2025-11-27T11:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.759324 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.759368 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.759379 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.759395 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.759407 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:03Z","lastTransitionTime":"2025-11-27T11:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.861690 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.861758 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.861772 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.861791 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.861803 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:03Z","lastTransitionTime":"2025-11-27T11:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.964771 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.964834 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.964851 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.964876 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:03 crc kubenswrapper[4807]: I1127 11:10:03.964895 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:03Z","lastTransitionTime":"2025-11-27T11:10:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.067696 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.067729 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.067738 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.067750 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.067760 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:04Z","lastTransitionTime":"2025-11-27T11:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.100161 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.101001 4807 scope.go:117] "RemoveContainer" containerID="c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.115574 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.133717 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.150314 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.169001 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.170749 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.170784 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.170795 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.170811 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.170823 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:04Z","lastTransitionTime":"2025-11-27T11:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.179740 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.194653 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.221465 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.237385 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c
9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.251724 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/1.log" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.253854 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerStarted","Data":"4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8"} Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.254505 4807 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.273115 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.273152 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.273161 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.273181 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.273189 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:04Z","lastTransitionTime":"2025-11-27T11:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.293576 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc 
kubenswrapper[4807]: I1127 11:10:04.307538 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.323570 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.339849 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.360364 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:50Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130557 6190 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130819 6190 obj_retry.go:551] Creating *factory.egressNode crc took: 6.095286ms\\\\nI1127 11:09:50.130841 6190 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 11:09:50.130863 6190 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 11:09:50.131300 6190 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 11:09:50.131455 6190 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 11:09:50.131502 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:09:50.131529 6190 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 11:09:50.131618 6190 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0
e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.373363 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19166
3197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.375885 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.375913 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.375922 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.375937 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.375946 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:04Z","lastTransitionTime":"2025-11-27T11:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.386652 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.399790 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.411868 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.428757 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.443306 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.455753 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.468002 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.478578 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.478650 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.478663 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.478683 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.478721 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:04Z","lastTransitionTime":"2025-11-27T11:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.485813 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.500911 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b2
25078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.514154 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.531919 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.531973 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:04 crc kubenswrapper[4807]: E1127 11:10:04.532069 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:04 crc kubenswrapper[4807]: E1127 11:10:04.532276 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.532497 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:04 crc kubenswrapper[4807]: E1127 11:10:04.532600 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.532709 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:04 crc kubenswrapper[4807]: E1127 11:10:04.532803 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.535284 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ov
n-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\
\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:50Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130557 6190 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column 
_uuid == {8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130819 6190 obj_retry.go:551] Creating *factory.egressNode crc took: 6.095286ms\\\\nI1127 11:09:50.130841 6190 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 11:09:50.130863 6190 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 11:09:50.131300 6190 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 11:09:50.131455 6190 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 11:09:50.131502 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:09:50.131529 6190 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 11:09:50.131618 6190 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:10:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.552983 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.564562 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.578923 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.580398 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.580439 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.580451 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.580468 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.580479 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:04Z","lastTransitionTime":"2025-11-27T11:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.597295 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.612359 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.627448 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1
117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.647877 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.657974 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.668577 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:04Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.682336 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.682393 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.682404 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.682415 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.682424 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:04Z","lastTransitionTime":"2025-11-27T11:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.784730 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.784757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.784764 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.784776 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.784783 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:04Z","lastTransitionTime":"2025-11-27T11:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.886797 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.887026 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.887113 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.887208 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.887358 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:04Z","lastTransitionTime":"2025-11-27T11:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.989553 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.989580 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.989589 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.989601 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:04 crc kubenswrapper[4807]: I1127 11:10:04.989609 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:04Z","lastTransitionTime":"2025-11-27T11:10:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.092190 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.092272 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.092291 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.092314 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.092331 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:05Z","lastTransitionTime":"2025-11-27T11:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.195013 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.195055 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.195067 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.195085 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.195097 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:05Z","lastTransitionTime":"2025-11-27T11:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.260213 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/2.log" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.262076 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/1.log" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.266698 4807 generic.go:334] "Generic (PLEG): container finished" podID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerID="4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8" exitCode=1 Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.266771 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.266824 4807 scope.go:117] "RemoveContainer" containerID="c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.267186 4807 scope.go:117] "RemoveContainer" containerID="4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8" Nov 27 11:10:05 crc kubenswrapper[4807]: E1127 11:10:05.267336 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.288323 4807 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.288392 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.288412 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.288441 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.288463 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:05Z","lastTransitionTime":"2025-11-27T11:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.290841 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c82055b96d87fe19dcb9a6fb98ce2d4f86eff95c20e35fc0b6ea49b2b4eaf1fe\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:09:50Z\\\",\\\"message\\\":\\\"il\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130557 6190 transact.go:42] Configuring OVN: [{Op:update Table:Logical_Router_Static_Route Row:map[ip_prefix:10.217.0.0/22 nexthop:100.64.0.2 policy:{GoSet:[src-ip]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{8944024f-deb7-4076-afb3-4b50a2ff4b4b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:static_routes Mutator:insert Value:{GoSet:[{GoUUID:8944024f-deb7-4076-afb3-4b50a2ff4b4b}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1127 11:09:50.130819 6190 obj_retry.go:551] Creating *factory.egressNode crc took: 6.095286ms\\\\nI1127 11:09:50.130841 6190 factory.go:1336] Added *v1.Node event handler 7\\\\nI1127 11:09:50.130863 6190 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1127 11:09:50.131300 6190 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1127 11:09:50.131455 6190 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1127 11:09:50.131502 6190 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:09:50.131529 6190 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1127 11:09:50.131618 6190 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:04Z\\\",\\\"message\\\":\\\"shift-kube-controller-manager/kube-controller-manager-crc\\\\nI1127 11:10:04.971738 6418 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:10:04.971888 6418 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k6wll\\\\nI1127 11:10:04.972487 6418 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; 
gw=[10.217.0.1]\\\\nI1127 11:10:04.972521 6418 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 11:10:04.972458 6418 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1127 11:10:04.972615 6418 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:10:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7
447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: E1127 11:10:05.306384 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.310979 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.311004 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.311013 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.311028 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.311037 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:05Z","lastTransitionTime":"2025-11-27T11:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.313920 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: E1127 11:10:05.326349 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.329009 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d
34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.330798 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.330834 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.330851 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.330873 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.330891 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:05Z","lastTransitionTime":"2025-11-27T11:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.344977 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:
09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: E1127 11:10:05.350499 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.355027 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.355055 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.355066 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.355122 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.355133 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:05Z","lastTransitionTime":"2025-11-27T11:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.359334 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: E1127 11:10:05.368149 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.372206 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.372239 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.372276 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.372302 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.372328 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:05Z","lastTransitionTime":"2025-11-27T11:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.373591 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: E1127 11:10:05.383128 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: E1127 11:10:05.383231 4807 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.384563 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.384595 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.384606 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.384620 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.384632 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:05Z","lastTransitionTime":"2025-11-27T11:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.391079 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.403055 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.416447 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.429047 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.441343 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1
117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.465664 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.478901 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.487334 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.487376 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.487388 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:05 crc 
kubenswrapper[4807]: I1127 11:10:05.487404 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.487416 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:05Z","lastTransitionTime":"2025-11-27T11:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.489491 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc 
kubenswrapper[4807]: I1127 11:10:05.501674 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.514706 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.527124 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:05Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.590204 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.590270 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.590280 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.590293 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.590303 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:05Z","lastTransitionTime":"2025-11-27T11:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.692137 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.692188 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.692201 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.692218 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.692231 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:05Z","lastTransitionTime":"2025-11-27T11:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.794921 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.794960 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.794969 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.794983 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.794992 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:05Z","lastTransitionTime":"2025-11-27T11:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.897708 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.897745 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.897754 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.897769 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:05 crc kubenswrapper[4807]: I1127 11:10:05.897778 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:05Z","lastTransitionTime":"2025-11-27T11:10:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.000116 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.000159 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.000168 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.000183 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.000192 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:06Z","lastTransitionTime":"2025-11-27T11:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.103280 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.103324 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.103333 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.103347 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.103356 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:06Z","lastTransitionTime":"2025-11-27T11:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.206416 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.206481 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.206501 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.206651 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.206694 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:06Z","lastTransitionTime":"2025-11-27T11:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.274484 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/2.log" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.281348 4807 scope.go:117] "RemoveContainer" containerID="4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8" Nov 27 11:10:06 crc kubenswrapper[4807]: E1127 11:10:06.281681 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.301918 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.309687 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.309751 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.309775 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.309808 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.309830 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:06Z","lastTransitionTime":"2025-11-27T11:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.324376 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.337422 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.354496 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc 
kubenswrapper[4807]: I1127 11:10:06.375557 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f21
4d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.391889 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.410330 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.412374 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.412432 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.412447 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.412466 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.412480 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:06Z","lastTransitionTime":"2025-11-27T11:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.433881 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:04Z\\\",\\\"message\\\":\\\"shift-kube-controller-manager/kube-controller-manager-crc\\\\nI1127 11:10:04.971738 6418 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:10:04.971888 6418 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k6wll\\\\nI1127 11:10:04.972487 6418 base_network_controller_pods.go:916] Annotation values: 
ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1127 11:10:04.972521 6418 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 11:10:04.972458 6418 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1127 11:10:04.972615 6418 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:10:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0
e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.445892 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.455055 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.472058 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.488875 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.503752 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.515105 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.515132 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.515144 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:06 crc 
kubenswrapper[4807]: I1127 11:10:06.515161 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.515171 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:06Z","lastTransitionTime":"2025-11-27T11:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.519038 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27
T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.531300 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.531362 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:06 crc kubenswrapper[4807]: E1127 11:10:06.531459 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.531470 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.531464 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:06 crc kubenswrapper[4807]: E1127 11:10:06.531557 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:06 crc kubenswrapper[4807]: E1127 11:10:06.531634 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:06 crc kubenswrapper[4807]: E1127 11:10:06.531685 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.552084 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-2
7T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.602133 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.615740 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:06Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.616995 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.617038 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.617051 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.617071 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.617085 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:06Z","lastTransitionTime":"2025-11-27T11:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.727292 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.727353 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.727382 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.727409 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.727608 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:06Z","lastTransitionTime":"2025-11-27T11:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.830895 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.830944 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.830960 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.830978 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.830993 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:06Z","lastTransitionTime":"2025-11-27T11:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.934010 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.934451 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.934622 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.934947 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:06 crc kubenswrapper[4807]: I1127 11:10:06.935105 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:06Z","lastTransitionTime":"2025-11-27T11:10:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.038828 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.039608 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.039661 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.039689 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.039707 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:07Z","lastTransitionTime":"2025-11-27T11:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.142095 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.142142 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.142150 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.142166 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.142175 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:07Z","lastTransitionTime":"2025-11-27T11:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.244814 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.244851 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.244862 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.244877 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.244887 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:07Z","lastTransitionTime":"2025-11-27T11:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.347091 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.347129 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.347137 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.347150 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.347158 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:07Z","lastTransitionTime":"2025-11-27T11:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.450523 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.450586 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.450613 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.450642 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.450665 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:07Z","lastTransitionTime":"2025-11-27T11:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.553474 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.553534 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.553547 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.553564 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.553576 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:07Z","lastTransitionTime":"2025-11-27T11:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.656310 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.656387 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.656403 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.656419 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.656432 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:07Z","lastTransitionTime":"2025-11-27T11:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.759795 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.759840 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.759851 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.759867 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.759878 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:07Z","lastTransitionTime":"2025-11-27T11:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.862461 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.862546 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.862560 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.862578 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.862590 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:07Z","lastTransitionTime":"2025-11-27T11:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.964815 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.964877 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.964928 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.964953 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:07 crc kubenswrapper[4807]: I1127 11:10:07.964972 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:07Z","lastTransitionTime":"2025-11-27T11:10:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.067769 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.067803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.067812 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.067826 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.067835 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:08Z","lastTransitionTime":"2025-11-27T11:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.170454 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.170493 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.170501 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.170517 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.170525 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:08Z","lastTransitionTime":"2025-11-27T11:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.273308 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.273359 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.273378 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.273400 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.273416 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:08Z","lastTransitionTime":"2025-11-27T11:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.318031 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:08 crc kubenswrapper[4807]: E1127 11:10:08.318195 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:10:08 crc kubenswrapper[4807]: E1127 11:10:08.318686 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs podName:911bce2f-3fb2-484d-870f-d9737047bd10 nodeName:}" failed. No retries permitted until 2025-11-27 11:10:24.318660225 +0000 UTC m=+65.418158453 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs") pod "network-metrics-daemon-wszmz" (UID: "911bce2f-3fb2-484d-870f-d9737047bd10") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.376514 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.376571 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.376598 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.376625 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.376646 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:08Z","lastTransitionTime":"2025-11-27T11:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.479909 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.479949 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.479957 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.479972 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.479981 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:08Z","lastTransitionTime":"2025-11-27T11:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.531386 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.531425 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:08 crc kubenswrapper[4807]: E1127 11:10:08.531491 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.531396 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.531558 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:08 crc kubenswrapper[4807]: E1127 11:10:08.531611 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:08 crc kubenswrapper[4807]: E1127 11:10:08.531741 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:08 crc kubenswrapper[4807]: E1127 11:10:08.531848 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.583032 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.583128 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.583179 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.583206 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.583220 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:08Z","lastTransitionTime":"2025-11-27T11:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.685215 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.685510 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.685583 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.685678 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.685781 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:08Z","lastTransitionTime":"2025-11-27T11:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.787619 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.787678 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.787695 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.787719 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.787739 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:08Z","lastTransitionTime":"2025-11-27T11:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.890661 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.891014 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.891196 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.891435 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.891662 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:08Z","lastTransitionTime":"2025-11-27T11:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.996099 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.996158 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.996171 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.996187 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:08 crc kubenswrapper[4807]: I1127 11:10:08.996203 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:08Z","lastTransitionTime":"2025-11-27T11:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.098996 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.099599 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.099683 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.099757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.099832 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:09Z","lastTransitionTime":"2025-11-27T11:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.202217 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.202306 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.202316 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.202329 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.202338 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:09Z","lastTransitionTime":"2025-11-27T11:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.304357 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.304604 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.304674 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.304736 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.304797 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:09Z","lastTransitionTime":"2025-11-27T11:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.407609 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.407650 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.407666 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.407684 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.407701 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:09Z","lastTransitionTime":"2025-11-27T11:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.509781 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.509846 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.509856 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.509870 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.509879 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:09Z","lastTransitionTime":"2025-11-27T11:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.542910 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c0
84ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.554865 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.564597 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.576846 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.590007 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.601523 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.611722 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.611747 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.611755 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.611768 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.611776 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:09Z","lastTransitionTime":"2025-11-27T11:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.622367 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:04Z\\\",\\\"message\\\":\\\"shift-kube-controller-manager/kube-controller-manager-crc\\\\nI1127 11:10:04.971738 6418 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:10:04.971888 6418 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k6wll\\\\nI1127 11:10:04.972487 6418 base_network_controller_pods.go:916] Annotation values: 
ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1127 11:10:04.972521 6418 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 11:10:04.972458 6418 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1127 11:10:04.972615 6418 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:10:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0
e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.636440 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19166
3197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.649207 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.657674 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.668212 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.681491 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.695525 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1
117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.714585 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.714617 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.714626 4807 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.714638 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.714647 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:09Z","lastTransitionTime":"2025-11-27T11:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.726733 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a
5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.740970 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.755292 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.764518 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:10:09Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.820367 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.820980 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.821005 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.821035 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.821057 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:09Z","lastTransitionTime":"2025-11-27T11:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.922755 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.922824 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.922836 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.923113 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:09 crc kubenswrapper[4807]: I1127 11:10:09.923158 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:09Z","lastTransitionTime":"2025-11-27T11:10:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.026592 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.026648 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.026668 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.026692 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.026708 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:10Z","lastTransitionTime":"2025-11-27T11:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.129069 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.129112 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.129136 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.129155 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.129169 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:10Z","lastTransitionTime":"2025-11-27T11:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.231914 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.231975 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.231992 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.232013 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.232028 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:10Z","lastTransitionTime":"2025-11-27T11:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.333951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.333998 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.334013 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.334034 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.334049 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:10Z","lastTransitionTime":"2025-11-27T11:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.436771 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.436806 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.436817 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.436831 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.436841 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:10Z","lastTransitionTime":"2025-11-27T11:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.531659 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.531775 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.531673 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.531839 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.531768 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.531969 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.532037 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.532078 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.539105 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.539138 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.539148 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.539194 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.539207 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:10Z","lastTransitionTime":"2025-11-27T11:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.642507 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.642576 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.642586 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.642601 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.642610 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:10Z","lastTransitionTime":"2025-11-27T11:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.744512 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.744545 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.744579 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.744592 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.744600 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:10Z","lastTransitionTime":"2025-11-27T11:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.844814 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.845153 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 11:10:42.845111575 +0000 UTC m=+83.944609813 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.845292 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.845516 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.845677 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.845711 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.845771 4807 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:10:42.845746143 +0000 UTC m=+83.945244381 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.845821 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:10:42.845795314 +0000 UTC m=+83.945293552 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.846908 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.846964 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.846984 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.847008 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.847027 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:10Z","lastTransitionTime":"2025-11-27T11:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.947117 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.947185 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.947365 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.947403 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.947418 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.947456 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 
11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.947477 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 11:10:42.947457717 +0000 UTC m=+84.046956005 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.947512 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.947543 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:10:10 crc kubenswrapper[4807]: E1127 11:10:10.947648 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 11:10:42.947618511 +0000 UTC m=+84.047116749 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.949102 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.949139 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.949151 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.949168 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:10 crc kubenswrapper[4807]: I1127 11:10:10.949179 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:10Z","lastTransitionTime":"2025-11-27T11:10:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.052580 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.052645 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.052668 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.052696 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.052719 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:11Z","lastTransitionTime":"2025-11-27T11:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.091934 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.104338 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.107502 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.121795 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.140004 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1
117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.155035 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.155102 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.155114 4807 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.155131 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.155144 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:11Z","lastTransitionTime":"2025-11-27T11:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.178922 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e
33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866
be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a
5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.196188 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.210736 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.226526 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.240182 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshif
t-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.254367 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.257899 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.257952 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.257964 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.257982 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.257994 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:11Z","lastTransitionTime":"2025-11-27T11:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.269607 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.281708 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.300167 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.317349 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.344683 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:04Z\\\",\\\"message\\\":\\\"shift-kube-controller-manager/kube-controller-manager-crc\\\\nI1127 11:10:04.971738 6418 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:10:04.971888 6418 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k6wll\\\\nI1127 11:10:04.972487 6418 base_network_controller_pods.go:916] Annotation values: 
ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1127 11:10:04.972521 6418 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 11:10:04.972458 6418 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1127 11:10:04.972615 6418 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:10:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0
e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.360198 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.360239 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.360270 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.360288 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.360301 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:11Z","lastTransitionTime":"2025-11-27T11:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.364683 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.378764 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.389976 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:11Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.463423 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.463473 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.463489 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.463509 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.463524 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:11Z","lastTransitionTime":"2025-11-27T11:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.565934 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.566518 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.566715 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.567068 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.567297 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:11Z","lastTransitionTime":"2025-11-27T11:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.669486 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.669546 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.669557 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.669571 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.669580 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:11Z","lastTransitionTime":"2025-11-27T11:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.772093 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.772162 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.772182 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.772207 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.772221 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:11Z","lastTransitionTime":"2025-11-27T11:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.874900 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.874951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.874980 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.874994 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.875003 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:11Z","lastTransitionTime":"2025-11-27T11:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.977573 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.977632 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.977643 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.977659 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:11 crc kubenswrapper[4807]: I1127 11:10:11.977670 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:11Z","lastTransitionTime":"2025-11-27T11:10:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.081127 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.081175 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.081190 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.081214 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.081231 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:12Z","lastTransitionTime":"2025-11-27T11:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.184688 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.184730 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.184744 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.184764 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.184779 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:12Z","lastTransitionTime":"2025-11-27T11:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.289236 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.289315 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.289327 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.289349 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.289361 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:12Z","lastTransitionTime":"2025-11-27T11:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.393087 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.393154 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.393171 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.393202 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.393217 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:12Z","lastTransitionTime":"2025-11-27T11:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.495933 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.496019 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.496036 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.496055 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.496068 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:12Z","lastTransitionTime":"2025-11-27T11:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.531512 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.531559 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.531597 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.531522 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:12 crc kubenswrapper[4807]: E1127 11:10:12.531625 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:12 crc kubenswrapper[4807]: E1127 11:10:12.531668 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:12 crc kubenswrapper[4807]: E1127 11:10:12.531728 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:12 crc kubenswrapper[4807]: E1127 11:10:12.531780 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.599087 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.599164 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.599202 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.599235 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.599332 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:12Z","lastTransitionTime":"2025-11-27T11:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.702012 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.702051 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.702064 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.702084 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.702094 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:12Z","lastTransitionTime":"2025-11-27T11:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.804617 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.804673 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.804696 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.804715 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.804730 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:12Z","lastTransitionTime":"2025-11-27T11:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.907423 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.907486 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.907543 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.907568 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:12 crc kubenswrapper[4807]: I1127 11:10:12.907595 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:12Z","lastTransitionTime":"2025-11-27T11:10:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.010657 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.010722 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.010745 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.010776 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.010808 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:13Z","lastTransitionTime":"2025-11-27T11:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.113775 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.113828 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.113838 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.113854 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.113863 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:13Z","lastTransitionTime":"2025-11-27T11:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.216411 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.216496 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.216518 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.216545 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.216568 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:13Z","lastTransitionTime":"2025-11-27T11:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.318585 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.318638 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.318652 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.318674 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.318689 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:13Z","lastTransitionTime":"2025-11-27T11:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.421615 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.421669 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.421679 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.421694 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.421703 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:13Z","lastTransitionTime":"2025-11-27T11:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.526462 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.526499 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.526511 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.526526 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.526537 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:13Z","lastTransitionTime":"2025-11-27T11:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.629362 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.629408 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.629420 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.629436 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.629448 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:13Z","lastTransitionTime":"2025-11-27T11:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.731525 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.731575 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.731584 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.731596 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.731604 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:13Z","lastTransitionTime":"2025-11-27T11:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.833607 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.833648 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.833660 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.833677 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.833687 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:13Z","lastTransitionTime":"2025-11-27T11:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.936066 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.936101 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.936112 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.936129 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:13 crc kubenswrapper[4807]: I1127 11:10:13.936140 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:13Z","lastTransitionTime":"2025-11-27T11:10:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.038352 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.038393 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.038405 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.038421 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.038433 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:14Z","lastTransitionTime":"2025-11-27T11:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.140290 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.140322 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.140334 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.140348 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.140357 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:14Z","lastTransitionTime":"2025-11-27T11:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.243061 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.243101 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.243111 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.243125 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.243136 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:14Z","lastTransitionTime":"2025-11-27T11:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.345512 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.345546 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.345555 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.345568 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.345578 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:14Z","lastTransitionTime":"2025-11-27T11:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.447546 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.447579 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.447587 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.447603 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.447611 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:14Z","lastTransitionTime":"2025-11-27T11:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.532181 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.532181 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:14 crc kubenswrapper[4807]: E1127 11:10:14.532323 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.532203 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:14 crc kubenswrapper[4807]: E1127 11:10:14.532401 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.532184 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:14 crc kubenswrapper[4807]: E1127 11:10:14.532481 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:14 crc kubenswrapper[4807]: E1127 11:10:14.532551 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.549742 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.549768 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.549777 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.549789 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.549797 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:14Z","lastTransitionTime":"2025-11-27T11:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.651587 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.651616 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.651624 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.651637 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.651645 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:14Z","lastTransitionTime":"2025-11-27T11:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.754384 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.754432 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.754443 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.754465 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.754477 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:14Z","lastTransitionTime":"2025-11-27T11:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.856426 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.856461 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.856469 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.856484 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.856495 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:14Z","lastTransitionTime":"2025-11-27T11:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.958596 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.958634 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.958645 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.958658 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:14 crc kubenswrapper[4807]: I1127 11:10:14.958667 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:14Z","lastTransitionTime":"2025-11-27T11:10:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.061229 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.061303 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.061315 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.061331 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.061341 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.163765 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.163805 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.163816 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.163832 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.163842 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.268895 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.268928 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.268937 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.268951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.268961 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.371053 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.371094 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.371104 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.371118 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.371127 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.442971 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.443011 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.443020 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.443032 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.443042 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: E1127 11:10:15.454474 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:15Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.457294 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.457322 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.457331 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.457343 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.457352 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: E1127 11:10:15.466940 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:15Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.469639 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.469673 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.469684 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.469700 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.469713 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: E1127 11:10:15.480903 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:15Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.484234 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.484298 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.484315 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.484336 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.484353 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: E1127 11:10:15.495038 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:15Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.498533 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.498565 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.498576 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.498597 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.498609 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: E1127 11:10:15.512706 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:15Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:15 crc kubenswrapper[4807]: E1127 11:10:15.512817 4807 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.514763 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.514804 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.514821 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.514842 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.514858 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.617276 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.617318 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.617329 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.617344 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.617354 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.719953 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.720001 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.720014 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.720030 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.720042 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.822159 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.822197 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.822207 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.822225 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.822238 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.925035 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.925117 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.925127 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.925140 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:15 crc kubenswrapper[4807]: I1127 11:10:15.925150 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:15Z","lastTransitionTime":"2025-11-27T11:10:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.027827 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.027881 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.027900 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.027923 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.027939 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:16Z","lastTransitionTime":"2025-11-27T11:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.130689 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.130722 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.130732 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.130745 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.130754 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:16Z","lastTransitionTime":"2025-11-27T11:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.233162 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.233218 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.233233 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.233322 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.233343 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:16Z","lastTransitionTime":"2025-11-27T11:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.335919 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.335999 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.336012 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.336029 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.336038 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:16Z","lastTransitionTime":"2025-11-27T11:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.438593 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.438622 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.438634 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.438651 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.438664 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:16Z","lastTransitionTime":"2025-11-27T11:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.531710 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.531750 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.531770 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.531824 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:16 crc kubenswrapper[4807]: E1127 11:10:16.531823 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:16 crc kubenswrapper[4807]: E1127 11:10:16.531885 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:16 crc kubenswrapper[4807]: E1127 11:10:16.532024 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:16 crc kubenswrapper[4807]: E1127 11:10:16.532205 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.541145 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.541176 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.541188 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.541203 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.541216 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:16Z","lastTransitionTime":"2025-11-27T11:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.643782 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.643827 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.643850 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.643874 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.643888 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:16Z","lastTransitionTime":"2025-11-27T11:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.750411 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.750455 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.750466 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.750482 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.750490 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:16Z","lastTransitionTime":"2025-11-27T11:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.852731 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.852764 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.852772 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.852784 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.852792 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:16Z","lastTransitionTime":"2025-11-27T11:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.954964 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.954997 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.955006 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.955018 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:16 crc kubenswrapper[4807]: I1127 11:10:16.955028 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:16Z","lastTransitionTime":"2025-11-27T11:10:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.057207 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.057288 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.057305 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.057325 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.057343 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:17Z","lastTransitionTime":"2025-11-27T11:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.159606 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.159659 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.159678 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.159700 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.159719 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:17Z","lastTransitionTime":"2025-11-27T11:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.262360 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.262390 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.262398 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.262411 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.262419 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:17Z","lastTransitionTime":"2025-11-27T11:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.365167 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.365214 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.365230 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.365283 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.365302 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:17Z","lastTransitionTime":"2025-11-27T11:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.468139 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.468190 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.468201 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.468228 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.468240 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:17Z","lastTransitionTime":"2025-11-27T11:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.570421 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.570470 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.570480 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.570532 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.570553 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:17Z","lastTransitionTime":"2025-11-27T11:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.673067 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.673121 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.673132 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.673147 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.673156 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:17Z","lastTransitionTime":"2025-11-27T11:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.775156 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.775201 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.775216 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.775235 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.775273 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:17Z","lastTransitionTime":"2025-11-27T11:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.877799 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.877845 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.877856 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.877873 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.877884 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:17Z","lastTransitionTime":"2025-11-27T11:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.980714 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.980816 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.981709 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.982012 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:17 crc kubenswrapper[4807]: I1127 11:10:17.982400 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:17Z","lastTransitionTime":"2025-11-27T11:10:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.084138 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.084202 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.084230 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.084302 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.084328 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:18Z","lastTransitionTime":"2025-11-27T11:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.186485 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.186527 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.186538 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.186553 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.186564 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:18Z","lastTransitionTime":"2025-11-27T11:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.288851 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.288914 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.288929 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.288951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.288966 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:18Z","lastTransitionTime":"2025-11-27T11:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.391649 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.391703 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.391718 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.391738 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.391752 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:18Z","lastTransitionTime":"2025-11-27T11:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.494766 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.494824 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.494836 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.494854 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.494866 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:18Z","lastTransitionTime":"2025-11-27T11:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.531597 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.531629 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.531641 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:18 crc kubenswrapper[4807]: E1127 11:10:18.531701 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.531597 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:18 crc kubenswrapper[4807]: E1127 11:10:18.531782 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:18 crc kubenswrapper[4807]: E1127 11:10:18.531876 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:18 crc kubenswrapper[4807]: E1127 11:10:18.531937 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.596892 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.596939 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.596951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.596969 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.597010 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:18Z","lastTransitionTime":"2025-11-27T11:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.698710 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.698753 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.698761 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.698774 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.698783 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:18Z","lastTransitionTime":"2025-11-27T11:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.800958 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.801013 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.801030 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.801053 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.801066 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:18Z","lastTransitionTime":"2025-11-27T11:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.903742 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.903797 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.903808 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.903826 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:18 crc kubenswrapper[4807]: I1127 11:10:18.903838 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:18Z","lastTransitionTime":"2025-11-27T11:10:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.006488 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.006535 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.006547 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.006565 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.006577 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:19Z","lastTransitionTime":"2025-11-27T11:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.108661 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.108734 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.108758 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.108786 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.108807 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:19Z","lastTransitionTime":"2025-11-27T11:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.211942 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.211978 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.211989 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.212006 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.212020 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:19Z","lastTransitionTime":"2025-11-27T11:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.314113 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.314146 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.314161 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.314186 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.314198 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:19Z","lastTransitionTime":"2025-11-27T11:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.416761 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.416797 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.416808 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.416823 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.416835 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:19Z","lastTransitionTime":"2025-11-27T11:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.519441 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.519476 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.519486 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.519501 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.519510 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:19Z","lastTransitionTime":"2025-11-27T11:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.550353 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.562828 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.579620 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:04Z\\\",\\\"message\\\":\\\"shift-kube-controller-manager/kube-controller-manager-crc\\\\nI1127 11:10:04.971738 6418 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:10:04.971888 6418 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k6wll\\\\nI1127 11:10:04.972487 6418 base_network_controller_pods.go:916] Annotation values: 
ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1127 11:10:04.972521 6418 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 11:10:04.972458 6418 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1127 11:10:04.972615 6418 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:10:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0
e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.592129 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19166
3197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.603260 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.611604 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.620346 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.621803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.621856 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.621882 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.621899 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.621910 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:19Z","lastTransitionTime":"2025-11-27T11:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.631109 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.640896 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d840667c-8233-4eb0-9789-d41a8fb11fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d33545207add221fa61a7c8259b245fa2f114f53ef101d74503d4bbadad20fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a8b1a2034b437496a624127bf754d60abab11457ad19e6e074fe454d0e21b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284dd629e6c81f1232a2d56007a2a7471423b7a601c38bcd3bc264ab9586fc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.656307 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.667605 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.680398 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.690163 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.701503 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.711692 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.722737 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.724494 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.724521 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.724530 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.724543 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.724551 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:19Z","lastTransitionTime":"2025-11-27T11:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.731605 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.741356 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:19Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.825909 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.825938 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.825946 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.825959 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.825967 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:19Z","lastTransitionTime":"2025-11-27T11:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.927857 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.927910 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.927923 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.927938 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:19 crc kubenswrapper[4807]: I1127 11:10:19.927946 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:19Z","lastTransitionTime":"2025-11-27T11:10:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.030330 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.030372 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.030384 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.030418 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.030431 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:20Z","lastTransitionTime":"2025-11-27T11:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.132125 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.132348 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.132447 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.132519 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.132584 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:20Z","lastTransitionTime":"2025-11-27T11:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.234709 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.234742 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.234753 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.234769 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.234780 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:20Z","lastTransitionTime":"2025-11-27T11:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.336801 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.336833 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.336844 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.336859 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.336869 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:20Z","lastTransitionTime":"2025-11-27T11:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.442022 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.442065 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.442075 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.442094 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.442106 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:20Z","lastTransitionTime":"2025-11-27T11:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.531229 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.531293 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.531309 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:20 crc kubenswrapper[4807]: E1127 11:10:20.531373 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.531447 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:20 crc kubenswrapper[4807]: E1127 11:10:20.531511 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:20 crc kubenswrapper[4807]: E1127 11:10:20.531618 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:20 crc kubenswrapper[4807]: E1127 11:10:20.531721 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.544708 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.544747 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.544762 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.544784 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.544800 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:20Z","lastTransitionTime":"2025-11-27T11:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.647970 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.648027 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.648045 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.648070 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.648089 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:20Z","lastTransitionTime":"2025-11-27T11:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.750879 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.750927 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.750939 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.750957 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.750974 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:20Z","lastTransitionTime":"2025-11-27T11:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.853148 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.853193 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.853209 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.853230 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.853268 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:20Z","lastTransitionTime":"2025-11-27T11:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.955950 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.956012 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.956026 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.956047 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:20 crc kubenswrapper[4807]: I1127 11:10:20.956064 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:20Z","lastTransitionTime":"2025-11-27T11:10:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.058764 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.058805 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.058813 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.058827 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.058838 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:21Z","lastTransitionTime":"2025-11-27T11:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.161970 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.162019 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.162032 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.162051 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.162063 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:21Z","lastTransitionTime":"2025-11-27T11:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.264776 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.264815 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.264825 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.264843 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.264855 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:21Z","lastTransitionTime":"2025-11-27T11:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.367557 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.367599 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.367615 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.367634 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.367649 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:21Z","lastTransitionTime":"2025-11-27T11:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.469807 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.469867 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.469890 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.469918 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.469939 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:21Z","lastTransitionTime":"2025-11-27T11:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.533739 4807 scope.go:117] "RemoveContainer" containerID="4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8" Nov 27 11:10:21 crc kubenswrapper[4807]: E1127 11:10:21.533998 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.572702 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.572741 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.572758 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.572775 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.572788 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:21Z","lastTransitionTime":"2025-11-27T11:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.675087 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.675136 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.675149 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.675167 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.675180 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:21Z","lastTransitionTime":"2025-11-27T11:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.777748 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.777789 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.777799 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.777815 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.777827 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:21Z","lastTransitionTime":"2025-11-27T11:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.880344 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.880393 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.880402 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.880417 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.880426 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:21Z","lastTransitionTime":"2025-11-27T11:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.982503 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.982542 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.982554 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.982570 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:21 crc kubenswrapper[4807]: I1127 11:10:21.982597 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:21Z","lastTransitionTime":"2025-11-27T11:10:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.084591 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.084633 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.084643 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.084657 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.084668 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:22Z","lastTransitionTime":"2025-11-27T11:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.186751 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.186816 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.186830 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.186849 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.186861 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:22Z","lastTransitionTime":"2025-11-27T11:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.289300 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.289342 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.289351 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.289365 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.289376 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:22Z","lastTransitionTime":"2025-11-27T11:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.391261 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.391296 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.391307 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.391319 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.391328 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:22Z","lastTransitionTime":"2025-11-27T11:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.494067 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.494100 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.494114 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.494130 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.494141 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:22Z","lastTransitionTime":"2025-11-27T11:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.531399 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:22 crc kubenswrapper[4807]: E1127 11:10:22.531740 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.531469 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:22 crc kubenswrapper[4807]: E1127 11:10:22.531948 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.531454 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:22 crc kubenswrapper[4807]: E1127 11:10:22.532118 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.531483 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:22 crc kubenswrapper[4807]: E1127 11:10:22.532312 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.596450 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.596734 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.596816 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.596890 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.596956 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:22Z","lastTransitionTime":"2025-11-27T11:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.699509 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.699732 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.699767 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.699787 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.699798 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:22Z","lastTransitionTime":"2025-11-27T11:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.801627 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.801676 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.801701 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.801722 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.801736 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:22Z","lastTransitionTime":"2025-11-27T11:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.904108 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.904157 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.904176 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.904192 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:22 crc kubenswrapper[4807]: I1127 11:10:22.904203 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:22Z","lastTransitionTime":"2025-11-27T11:10:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.006732 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.006756 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.006763 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.006774 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.006782 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:23Z","lastTransitionTime":"2025-11-27T11:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.108995 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.109020 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.109028 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.109039 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.109048 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:23Z","lastTransitionTime":"2025-11-27T11:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.211238 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.211302 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.211316 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.211333 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.211345 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:23Z","lastTransitionTime":"2025-11-27T11:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.313523 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.313592 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.313604 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.313669 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.313682 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:23Z","lastTransitionTime":"2025-11-27T11:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.416315 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.416427 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.416447 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.416474 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.416492 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:23Z","lastTransitionTime":"2025-11-27T11:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.518958 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.519002 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.519014 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.519030 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.519042 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:23Z","lastTransitionTime":"2025-11-27T11:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.621282 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.621323 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.621336 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.621350 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.621360 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:23Z","lastTransitionTime":"2025-11-27T11:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.723732 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.723773 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.723785 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.723800 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.723812 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:23Z","lastTransitionTime":"2025-11-27T11:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.825985 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.826053 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.826075 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.826107 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.826129 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:23Z","lastTransitionTime":"2025-11-27T11:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.928284 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.928330 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.928342 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.928360 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:23 crc kubenswrapper[4807]: I1127 11:10:23.928370 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:23Z","lastTransitionTime":"2025-11-27T11:10:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.030797 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.030856 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.030867 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.030882 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.030893 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:24Z","lastTransitionTime":"2025-11-27T11:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.133738 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.133824 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.133834 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.133848 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.133858 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:24Z","lastTransitionTime":"2025-11-27T11:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.236032 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.236065 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.236076 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.236095 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.236107 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:24Z","lastTransitionTime":"2025-11-27T11:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.338189 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.338233 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.338267 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.338288 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.338300 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:24Z","lastTransitionTime":"2025-11-27T11:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.386062 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:24 crc kubenswrapper[4807]: E1127 11:10:24.386280 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:10:24 crc kubenswrapper[4807]: E1127 11:10:24.386371 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs podName:911bce2f-3fb2-484d-870f-d9737047bd10 nodeName:}" failed. No retries permitted until 2025-11-27 11:10:56.386353607 +0000 UTC m=+97.485851805 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs") pod "network-metrics-daemon-wszmz" (UID: "911bce2f-3fb2-484d-870f-d9737047bd10") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.440760 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.440803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.440814 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.440831 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.440844 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:24Z","lastTransitionTime":"2025-11-27T11:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.531421 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.531466 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.531466 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.531424 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:24 crc kubenswrapper[4807]: E1127 11:10:24.531550 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:24 crc kubenswrapper[4807]: E1127 11:10:24.531637 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:24 crc kubenswrapper[4807]: E1127 11:10:24.531700 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:24 crc kubenswrapper[4807]: E1127 11:10:24.531748 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.542686 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.542712 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.542723 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.542736 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.542746 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:24Z","lastTransitionTime":"2025-11-27T11:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.644609 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.644641 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.644650 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.644668 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.644678 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:24Z","lastTransitionTime":"2025-11-27T11:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.746614 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.746653 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.746666 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.746682 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.746692 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:24Z","lastTransitionTime":"2025-11-27T11:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.848623 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.848691 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.848714 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.848738 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.848756 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:24Z","lastTransitionTime":"2025-11-27T11:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.950702 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.950745 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.950756 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.950771 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:24 crc kubenswrapper[4807]: I1127 11:10:24.950779 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:24Z","lastTransitionTime":"2025-11-27T11:10:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.052901 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.052946 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.052958 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.052976 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.052988 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.155694 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.155729 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.155756 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.155769 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.155779 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.258162 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.258215 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.258233 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.258293 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.258315 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.360406 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.360442 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.360453 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.360502 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.360515 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.462931 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.463177 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.463264 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.463522 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.463590 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.566010 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.566052 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.566062 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.566079 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.566089 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.667865 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.667890 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.667897 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.667909 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.667918 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.804121 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.804151 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.804162 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.804176 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.804186 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.887189 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.887481 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.887603 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.887753 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.887849 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: E1127 11:10:25.898985 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:25Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.901984 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.902012 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.902022 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.902037 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.902047 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: E1127 11:10:25.911510 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:25Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.914183 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.914216 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.914227 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.914260 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.914275 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: E1127 11:10:25.924489 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:25Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.927233 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.927304 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.927316 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.927336 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.927347 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: E1127 11:10:25.938160 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:25Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.940949 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.940986 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.940998 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.941014 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.941026 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:25 crc kubenswrapper[4807]: E1127 11:10:25.951037 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:25Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:25 crc kubenswrapper[4807]: E1127 11:10:25.951197 4807 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.952544 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.952578 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.952587 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.952601 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:25 crc kubenswrapper[4807]: I1127 11:10:25.952611 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:25Z","lastTransitionTime":"2025-11-27T11:10:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.054748 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.054829 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.054846 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.054868 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.054886 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:26Z","lastTransitionTime":"2025-11-27T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.156857 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.156892 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.156902 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.156915 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.156925 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:26Z","lastTransitionTime":"2025-11-27T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.258703 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.258742 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.258752 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.258768 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.258777 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:26Z","lastTransitionTime":"2025-11-27T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.360995 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.361319 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.361327 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.361345 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.361354 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:26Z","lastTransitionTime":"2025-11-27T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.463693 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.463733 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.463742 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.463757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.463766 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:26Z","lastTransitionTime":"2025-11-27T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.531592 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:26 crc kubenswrapper[4807]: E1127 11:10:26.531789 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.532681 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:26 crc kubenswrapper[4807]: E1127 11:10:26.532750 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.532801 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:26 crc kubenswrapper[4807]: E1127 11:10:26.532854 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.532880 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:26 crc kubenswrapper[4807]: E1127 11:10:26.532929 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.544002 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.566850 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.566890 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.566901 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.566917 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.566928 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:26Z","lastTransitionTime":"2025-11-27T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.669096 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.669131 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.669140 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.669154 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.669163 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:26Z","lastTransitionTime":"2025-11-27T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.771508 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.771553 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.771563 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.771615 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.771629 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:26Z","lastTransitionTime":"2025-11-27T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.880299 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.880349 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.880361 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.880379 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.880391 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:26Z","lastTransitionTime":"2025-11-27T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.982958 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.982999 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.983008 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.983022 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:26 crc kubenswrapper[4807]: I1127 11:10:26.983031 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:26Z","lastTransitionTime":"2025-11-27T11:10:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.085864 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.085904 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.085913 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.085945 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.085955 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:27Z","lastTransitionTime":"2025-11-27T11:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.188206 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.188240 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.188265 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.188283 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.188294 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:27Z","lastTransitionTime":"2025-11-27T11:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.956801 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.956829 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.956837 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.956849 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.956858 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:27Z","lastTransitionTime":"2025-11-27T11:10:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.959378 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmngf_97f15cbb-220e-47db-b418-3a5aa4eb55a2/kube-multus/0.log" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.959418 4807 generic.go:334] "Generic (PLEG): container finished" podID="97f15cbb-220e-47db-b418-3a5aa4eb55a2" containerID="396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64" exitCode=1 Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.959518 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.959582 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.959665 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.959666 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:27 crc kubenswrapper[4807]: E1127 11:10:27.959838 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.959971 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmngf" event={"ID":"97f15cbb-220e-47db-b418-3a5aa4eb55a2","Type":"ContainerDied","Data":"396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64"} Nov 27 11:10:27 crc kubenswrapper[4807]: E1127 11:10:27.960069 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.960264 4807 scope.go:117] "RemoveContainer" containerID="396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64" Nov 27 11:10:27 crc kubenswrapper[4807]: E1127 11:10:27.960271 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:27 crc kubenswrapper[4807]: E1127 11:10:27.960563 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:27 crc kubenswrapper[4807]: I1127 11:10:27.981899 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:27Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.001412 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:27Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.014483 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.031954 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.048652 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.058888 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.058929 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.058945 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.058967 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.058983 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:28Z","lastTransitionTime":"2025-11-27T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.068517 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.080664 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb849-c866-48ca-a9fb-5f64ce2b6851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5855d42391ccbd3f32e7abf944c071e0912ec43fa3137269b842b95e6907b209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef3
18bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.097488 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:26Z\\\",\\\"message\\\":\\\"2025-11-27T11:09:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e\\\\n2025-11-27T11:09:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e to /host/opt/cni/bin/\\\\n2025-11-27T11:09:41Z [verbose] multus-daemon started\\\\n2025-11-27T11:09:41Z [verbose] Readiness Indicator file check\\\\n2025-11-27T11:10:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.123923 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:04Z\\\",\\\"message\\\":\\\"shift-kube-controller-manager/kube-controller-manager-crc\\\\nI1127 11:10:04.971738 6418 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:10:04.971888 6418 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k6wll\\\\nI1127 11:10:04.972487 6418 base_network_controller_pods.go:916] Annotation values: 
ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1127 11:10:04.972521 6418 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 11:10:04.972458 6418 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1127 11:10:04.972615 6418 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:10:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0
e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.138829 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.149222 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.159356 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.160602 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.160641 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.160659 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.160681 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.160694 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:28Z","lastTransitionTime":"2025-11-27T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.169694 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.179276 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.190493 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1
117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.202438 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d840667c-8233-4eb0-9789-d41a8fb11fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d33545207add221fa61a7c8259b245fa2f114f53ef101d74503d4bbadad20fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a8b1a2034b437496a624127bf754d60abab11457ad19e6e074fe454d0e21b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284dd629e6c81f1232a2d56007a2a7471423b7a601c38bcd3bc264ab9586fc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.254730 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.263122 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.263166 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:28 crc kubenswrapper[4807]: 
I1127 11:10:28.263176 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.263190 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.263200 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:28Z","lastTransitionTime":"2025-11-27T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.280115 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.295056 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.364877 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.364925 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.364934 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.364949 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.364958 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:28Z","lastTransitionTime":"2025-11-27T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.467467 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.467509 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.467517 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.467532 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.467541 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:28Z","lastTransitionTime":"2025-11-27T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.569862 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.569909 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.569917 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.569932 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.569941 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:28Z","lastTransitionTime":"2025-11-27T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.671847 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.671886 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.671895 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.671907 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.671915 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:28Z","lastTransitionTime":"2025-11-27T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.773775 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.773803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.773813 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.773825 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.773834 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:28Z","lastTransitionTime":"2025-11-27T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.875801 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.875830 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.875838 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.875850 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.875859 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:28Z","lastTransitionTime":"2025-11-27T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.963938 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmngf_97f15cbb-220e-47db-b418-3a5aa4eb55a2/kube-multus/0.log" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.964001 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmngf" event={"ID":"97f15cbb-220e-47db-b418-3a5aa4eb55a2","Type":"ContainerStarted","Data":"a509853063b75406f7fc467d6ab041935f8cee585f17f27c8618916093d4a624"} Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.977671 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.977697 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.977706 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.977717 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.977725 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:28Z","lastTransitionTime":"2025-11-27T11:10:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.984522 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:28 crc kubenswrapper[4807]: I1127 11:10:28.995918 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb849-c866-48ca-a9fb-5f64ce2b6851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5855d42391ccbd3f32e7abf944c071e0912ec43fa3137269b842b95e6907b209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef3
18bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:28Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.008387 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a509853063b75406f7fc467d6ab041935f8cee585f17f27c8618916093d4a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:26Z\\\",\\\"message\\\":\\\"2025-11-27T11:09:40+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e\\\\n2025-11-27T11:09:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e to /host/opt/cni/bin/\\\\n2025-11-27T11:09:41Z [verbose] multus-daemon started\\\\n2025-11-27T11:09:41Z [verbose] Readiness Indicator file check\\\\n2025-11-27T11:10:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.024054 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:04Z\\\",\\\"message\\\":\\\"shift-kube-controller-manager/kube-controller-manager-crc\\\\nI1127 11:10:04.971738 6418 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:10:04.971888 6418 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k6wll\\\\nI1127 11:10:04.972487 6418 base_network_controller_pods.go:916] Annotation values: 
ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1127 11:10:04.972521 6418 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 11:10:04.972458 6418 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1127 11:10:04.972615 6418 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:10:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0
e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.043633 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19166
3197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.056753 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.068357 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.078069 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.079382 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.079404 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.079413 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.079425 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.079434 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:29Z","lastTransitionTime":"2025-11-27T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.090031 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d840667c-8233-4eb0-9789-d41a8fb11fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d33545207add221fa61a7c8259b245fa2f114f53ef101d74503d4bbadad20fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a8b1a2034b437496a624127bf754
d60abab11457ad19e6e074fe454d0e21b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284dd629e6c81f1232a2d56007a2a7471423b7a601c38bcd3bc264ab9586fc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.111557 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.121875 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.135198 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.146596 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.158300 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.168743 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.180997 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.181800 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.181858 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.181871 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.181891 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.181904 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:29Z","lastTransitionTime":"2025-11-27T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.194454 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.204292 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.213792 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc 
kubenswrapper[4807]: I1127 11:10:29.283535 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.283569 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.283579 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.283593 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.283603 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:29Z","lastTransitionTime":"2025-11-27T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.385409 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.385450 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.385459 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.385473 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.385484 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:29Z","lastTransitionTime":"2025-11-27T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.487814 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.487862 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.487878 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.487906 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.487924 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:29Z","lastTransitionTime":"2025-11-27T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.532217 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:29 crc kubenswrapper[4807]: E1127 11:10:29.532385 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.532217 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.532438 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:29 crc kubenswrapper[4807]: E1127 11:10:29.532469 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:29 crc kubenswrapper[4807]: E1127 11:10:29.532545 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.533771 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:29 crc kubenswrapper[4807]: E1127 11:10:29.534989 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.550542 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fde
e88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resource
s\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.564211 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.574000 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.582831 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.590004 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.590029 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.590039 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.590051 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.590066 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:29Z","lastTransitionTime":"2025-11-27T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.593559 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.602505 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb849-c866-48ca-a9fb-5f64ce2b6851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5855d42391ccbd3f32e7abf944c071e0912ec43fa3137269b842b95e6907b209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef3
18bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.615162 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a509853063b75406f7fc467d6ab041935f8cee585f17f27c8618916093d4a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:26Z\\\",\\\"message\\\":\\\"2025-11-27T11:09:40+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e\\\\n2025-11-27T11:09:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e to /host/opt/cni/bin/\\\\n2025-11-27T11:09:41Z [verbose] multus-daemon started\\\\n2025-11-27T11:09:41Z [verbose] Readiness Indicator file check\\\\n2025-11-27T11:10:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.638377 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:04Z\\\",\\\"message\\\":\\\"shift-kube-controller-manager/kube-controller-manager-crc\\\\nI1127 11:10:04.971738 6418 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:10:04.971888 6418 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k6wll\\\\nI1127 11:10:04.972487 6418 base_network_controller_pods.go:916] Annotation values: 
ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1127 11:10:04.972521 6418 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 11:10:04.972458 6418 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1127 11:10:04.972615 6418 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:10:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0
e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.654270 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19166
3197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.666394 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.674709 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.683939 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d840667c-8233-4eb0-9789-d41a8fb11fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d33545207add221fa61a7c8259b245fa2f114f53ef101d74503d4bbadad20fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a8b1a2034b437496a624127bf754d60abab11457ad19e6e074fe454d0e21b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284dd629e6c81f1232a2d56007a2a7471423b7a601c38bcd3bc264ab9586fc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.698003 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.698026 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.698034 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 
27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.698045 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.698054 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:29Z","lastTransitionTime":"2025-11-27T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.708041 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.719821 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.732498 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.745397 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.757114 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.767286 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.777645 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1
117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:29Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.800468 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.800494 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.800503 4807 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.800515 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.800524 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:29Z","lastTransitionTime":"2025-11-27T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.902705 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.902751 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.902765 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.902782 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:29 crc kubenswrapper[4807]: I1127 11:10:29.902792 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:29Z","lastTransitionTime":"2025-11-27T11:10:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.005687 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.005737 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.005748 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.005765 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.005776 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:30Z","lastTransitionTime":"2025-11-27T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.107804 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.107844 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.107855 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.107870 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.107882 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:30Z","lastTransitionTime":"2025-11-27T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.210008 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.210047 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.210058 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.210071 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.210080 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:30Z","lastTransitionTime":"2025-11-27T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.311822 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.311855 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.311864 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.311876 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.311885 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:30Z","lastTransitionTime":"2025-11-27T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.414519 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.414564 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.414578 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.414598 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.414612 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:30Z","lastTransitionTime":"2025-11-27T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.517062 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.517142 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.517163 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.517186 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.517201 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:30Z","lastTransitionTime":"2025-11-27T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.619119 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.619168 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.619183 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.619204 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.619220 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:30Z","lastTransitionTime":"2025-11-27T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.754707 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.754755 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.754767 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.754785 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.754797 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:30Z","lastTransitionTime":"2025-11-27T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.856951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.857013 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.857027 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.857044 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.857055 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:30Z","lastTransitionTime":"2025-11-27T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.958912 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.958942 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.958952 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.958964 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:30 crc kubenswrapper[4807]: I1127 11:10:30.958972 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:30Z","lastTransitionTime":"2025-11-27T11:10:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.060635 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.060667 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.060676 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.060689 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.060698 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:31Z","lastTransitionTime":"2025-11-27T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.162875 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.162914 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.162923 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.162937 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.162946 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:31Z","lastTransitionTime":"2025-11-27T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.265329 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.265378 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.265389 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.265408 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.265423 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:31Z","lastTransitionTime":"2025-11-27T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.367274 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.367313 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.367321 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.367334 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.367345 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:31Z","lastTransitionTime":"2025-11-27T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.469626 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.469665 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.469677 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.469694 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.469706 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:31Z","lastTransitionTime":"2025-11-27T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.532323 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.532420 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:31 crc kubenswrapper[4807]: E1127 11:10:31.532510 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.532544 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.532606 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:31 crc kubenswrapper[4807]: E1127 11:10:31.532629 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:31 crc kubenswrapper[4807]: E1127 11:10:31.532719 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:31 crc kubenswrapper[4807]: E1127 11:10:31.532782 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.572531 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.572577 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.572586 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.572601 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.572611 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:31Z","lastTransitionTime":"2025-11-27T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.675064 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.675107 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.675118 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.675135 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.675147 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:31Z","lastTransitionTime":"2025-11-27T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.777729 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.777768 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.777779 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.777795 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.777810 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:31Z","lastTransitionTime":"2025-11-27T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.879729 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.879765 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.879774 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.879789 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.879800 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:31Z","lastTransitionTime":"2025-11-27T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.981300 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.981350 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.981361 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.981378 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:31 crc kubenswrapper[4807]: I1127 11:10:31.981389 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:31Z","lastTransitionTime":"2025-11-27T11:10:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.084512 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.084588 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.084613 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.084638 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.084659 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:32Z","lastTransitionTime":"2025-11-27T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.186892 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.186930 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.186949 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.186967 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.186976 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:32Z","lastTransitionTime":"2025-11-27T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.289828 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.289890 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.289907 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.290319 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.290374 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:32Z","lastTransitionTime":"2025-11-27T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.393575 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.393623 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.393635 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.393653 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.393665 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:32Z","lastTransitionTime":"2025-11-27T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.496139 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.496170 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.496181 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.496193 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.496202 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:32Z","lastTransitionTime":"2025-11-27T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.532534 4807 scope.go:117] "RemoveContainer" containerID="4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.599042 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.599092 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.599102 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.599118 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.599129 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:32Z","lastTransitionTime":"2025-11-27T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.701479 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.701523 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.701538 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.701563 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.701580 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:32Z","lastTransitionTime":"2025-11-27T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.803328 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.803373 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.803385 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.803402 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.803414 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:32Z","lastTransitionTime":"2025-11-27T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.905574 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.905616 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.905625 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.905637 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.905647 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:32Z","lastTransitionTime":"2025-11-27T11:10:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.976101 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/2.log" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.978714 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerStarted","Data":"a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9"} Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.979038 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.989843 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:32Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:32 crc kubenswrapper[4807]: I1127 11:10:32.999072 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:32Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.007786 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.007825 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.007837 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.007853 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.007864 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:33Z","lastTransitionTime":"2025-11-27T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.008945 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d840667c-8233-4eb0-9789-d41a8fb11fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d33545207add221fa61a7c8259b245fa2f114f53ef101d74503d4bbadad20fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a8b1a2034b437496a624127bf754
d60abab11457ad19e6e074fe454d0e21b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284dd629e6c81f1232a2d56007a2a7471423b7a601c38bcd3bc264ab9586fc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.029181 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.040638 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.052689 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.064352 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.081038 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.090778 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.100964 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1
117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.109625 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.109654 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.109662 4807 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.109679 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.109689 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:33Z","lastTransitionTime":"2025-11-27T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.114584 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.128278 4807 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.140306 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tm
p/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.151996 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.170234 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.181903 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb849-c866-48ca-a9fb-5f64ce2b6851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5855d42391ccbd3f32e7abf944c071e0912ec43fa3137269b842b95e6907b209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef3
18bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.194972 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a509853063b75406f7fc467d6ab041935f8cee585f17f27c8618916093d4a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:26Z\\\",\\\"message\\\":\\\"2025-11-27T11:09:40+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e\\\\n2025-11-27T11:09:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e to /host/opt/cni/bin/\\\\n2025-11-27T11:09:41Z [verbose] multus-daemon started\\\\n2025-11-27T11:09:41Z [verbose] Readiness Indicator file check\\\\n2025-11-27T11:10:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.217848 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.217892 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.217903 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.217916 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.217927 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:33Z","lastTransitionTime":"2025-11-27T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.219640 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:04Z\\\",\\\"message\\\":\\\"shift-kube-controller-manager/kube-controller-manager-crc\\\\nI1127 11:10:04.971738 6418 ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:10:04.971888 6418 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k6wll\\\\nI1127 11:10:04.972487 6418 base_network_controller_pods.go:916] Annotation values: 
ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1127 11:10:04.972521 6418 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 11:10:04.972458 6418 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1127 11:10:04.972615 6418 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:10:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.234380 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19166
3197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.320964 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.321009 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.321024 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.321044 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.321060 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:33Z","lastTransitionTime":"2025-11-27T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.425723 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.425773 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.425790 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.425814 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.425832 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:33Z","lastTransitionTime":"2025-11-27T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.528821 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.528869 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.528886 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.528910 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.528930 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:33Z","lastTransitionTime":"2025-11-27T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.536468 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:33 crc kubenswrapper[4807]: E1127 11:10:33.536628 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.536863 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:33 crc kubenswrapper[4807]: E1127 11:10:33.536980 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.537308 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:33 crc kubenswrapper[4807]: E1127 11:10:33.537431 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.537687 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:33 crc kubenswrapper[4807]: E1127 11:10:33.537791 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.631577 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.631795 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.631823 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.631854 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.631876 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:33Z","lastTransitionTime":"2025-11-27T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.734080 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.734116 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.734151 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.734168 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.734180 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:33Z","lastTransitionTime":"2025-11-27T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.838018 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.838079 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.838097 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.838125 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.838142 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:33Z","lastTransitionTime":"2025-11-27T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.940863 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.940919 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.940935 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.940957 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.940974 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:33Z","lastTransitionTime":"2025-11-27T11:10:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.985504 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/3.log" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.986662 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/2.log" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.991881 4807 generic.go:334] "Generic (PLEG): container finished" podID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerID="a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9" exitCode=1 Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.992011 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9"} Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.992095 4807 scope.go:117] "RemoveContainer" containerID="4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8" Nov 27 11:10:33 crc kubenswrapper[4807]: I1127 11:10:33.992845 4807 scope.go:117] "RemoveContainer" containerID="a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9" Nov 27 11:10:33 crc kubenswrapper[4807]: E1127 11:10:33.993087 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.020897 4807 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11
-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba86
5010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.043048 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839
ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.045221 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.045304 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.045321 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.045344 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.045362 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:34Z","lastTransitionTime":"2025-11-27T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.060367 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb849-c866-48ca-a9fb-5f64ce2b6851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5855d42391ccbd3f32e7abf944c071e0912ec43fa3137269b842b95e6907b209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.080802 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a509853063b75406f7fc467d6ab041935f8cee585f17f27c8618916093d4a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:26Z\\\",\\\"message\\\":\\\"2025-11-27T11:09:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e\\\\n2025-11-27T11:09:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e to /host/opt/cni/bin/\\\\n2025-11-27T11:09:41Z [verbose] multus-daemon started\\\\n2025-11-27T11:09:41Z [verbose] 
Readiness Indicator file check\\\\n2025-11-27T11:10:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.112543 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e797ccafa4cc73baf5a9e7162b79a4e00c52722968fcba38c41d5350f9a75f8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:04Z\\\",\\\"message\\\":\\\"shift-kube-controller-manager/kube-controller-manager-crc\\\\nI1127 11:10:04.971738 6418 
ovnkube.go:599] Stopped ovnkube\\\\nI1127 11:10:04.971888 6418 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-k6wll\\\\nI1127 11:10:04.972487 6418 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1127 11:10:04.972521 6418 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1127 11:10:04.972458 6418 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1127 11:10:04.972615 6418 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:10:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:33Z\\\",\\\"message\\\":\\\"rding success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1127 11:10:33.295482 6781 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-wszmz\\\\nI1127 
11:10:33.295490 6781 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1127 11:10:33.295491 6781 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z]\\\\nI1127 11:10:33.295497 6781 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1127 
11:10:33.2955\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:10:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c
4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.129321 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.144563 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.147338 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.147381 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.147400 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.147423 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.147440 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:34Z","lastTransitionTime":"2025-11-27T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.163821 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.181394 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.192971 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.205960 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1
117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.218682 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d840667c-8233-4eb0-9789-d41a8fb11fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d33545207add221fa61a7c8259b245fa2f114f53ef101d74503d4bbadad20fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a8b1a2034b437496a624127bf754d60abab11457ad19e6e074fe454d0e21b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284dd629e6c81f1232a2d56007a2a7471423b7a601c38bcd3bc264ab9586fc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.248191 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.250018 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.250043 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:34 crc kubenswrapper[4807]: 
I1127 11:10:34.250051 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.250065 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.250077 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:34Z","lastTransitionTime":"2025-11-27T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.264206 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.279343 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.293188 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.314437 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.329535 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.341161 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:34Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.353734 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.353791 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.353802 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.353819 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.353833 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:34Z","lastTransitionTime":"2025-11-27T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.456403 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.456464 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.456476 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.456494 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.456507 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:34Z","lastTransitionTime":"2025-11-27T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.559715 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.559802 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.559823 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.559854 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.559879 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:34Z","lastTransitionTime":"2025-11-27T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.663193 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.663264 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.663277 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.663295 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.663307 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:34Z","lastTransitionTime":"2025-11-27T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.766346 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.766404 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.766421 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.766444 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.766460 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:34Z","lastTransitionTime":"2025-11-27T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.869651 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.869719 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.869738 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.869814 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.869850 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:34Z","lastTransitionTime":"2025-11-27T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.973185 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.973236 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.973288 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.973312 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.973329 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:34Z","lastTransitionTime":"2025-11-27T11:10:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:34 crc kubenswrapper[4807]: I1127 11:10:34.997589 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/3.log" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.002543 4807 scope.go:117] "RemoveContainer" containerID="a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9" Nov 27 11:10:35 crc kubenswrapper[4807]: E1127 11:10:35.002778 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.019213 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.036897 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc 
kubenswrapper[4807]: I1127 11:10:35.057529 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.076210 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.076284 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.076304 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.076327 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.076346 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:35Z","lastTransitionTime":"2025-11-27T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.077381 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.094399 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a509853063b75406f7fc467d6ab041935f8cee585f17f27c8618916093d4a624\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:26Z\\\",\\\"message\\\":\\\"2025-11-27T11:09:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e\\\\n2025-11-27T11:09:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e to /host/opt/cni/bin/\\\\n2025-11-27T11:09:41Z [verbose] multus-daemon started\\\\n2025-11-27T11:09:41Z [verbose] Readiness Indicator file check\\\\n2025-11-27T11:10:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.123550 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:33Z\\\",\\\"message\\\":\\\"rding success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1127 11:10:33.295482 6781 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-wszmz\\\\nI1127 11:10:33.295490 6781 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1127 11:10:33.295491 6781 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z]\\\\nI1127 11:10:33.295497 6781 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1127 11:10:33.2955\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:10:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0
e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.147091 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19166
3197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.168226 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.180081 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.180135 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.180152 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.180177 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.180196 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:35Z","lastTransitionTime":"2025-11-27T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.183605 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb849-c866-48ca-a9fb-5f64ce2b6851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5855d42391ccbd3f32e7abf944c071e0912ec43fa3137269b842b95e6907b209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.202216 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.220784 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.239880 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.260055 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.280617 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.283590 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.283671 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.283692 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.283717 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.283736 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:35Z","lastTransitionTime":"2025-11-27T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.300941 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.321824 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.340740 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1
117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.358423 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d840667c-8233-4eb0-9789-d41a8fb11fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d33545207add221fa61a7c8259b245fa2f114f53ef101d74503d4bbadad20fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a8b1a2034b437496a624127bf754d60abab11457ad19e6e074fe454d0e21b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284dd629e6c81f1232a2d56007a2a7471423b7a601c38bcd3bc264ab9586fc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.386665 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.386725 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.386745 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.386772 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.386789 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:35Z","lastTransitionTime":"2025-11-27T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.391146 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:35Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.489558 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.489624 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.489645 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.489670 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.489689 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:35Z","lastTransitionTime":"2025-11-27T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.532295 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.532298 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.532370 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.532462 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:35 crc kubenswrapper[4807]: E1127 11:10:35.532706 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:35 crc kubenswrapper[4807]: E1127 11:10:35.532873 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:35 crc kubenswrapper[4807]: E1127 11:10:35.533071 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:35 crc kubenswrapper[4807]: E1127 11:10:35.533211 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.592330 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.592386 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.592402 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.592427 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.592444 4807 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:35Z","lastTransitionTime":"2025-11-27T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.695139 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.695212 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.695236 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.695306 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.695332 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:35Z","lastTransitionTime":"2025-11-27T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.798630 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.798696 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.798713 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.798738 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.798756 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:35Z","lastTransitionTime":"2025-11-27T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.902299 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.902362 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.902380 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.902409 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.902426 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:35Z","lastTransitionTime":"2025-11-27T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.998178 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.998240 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.998294 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.998319 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:35 crc kubenswrapper[4807]: I1127 11:10:35.998337 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:35Z","lastTransitionTime":"2025-11-27T11:10:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:36 crc kubenswrapper[4807]: E1127 11:10:36.018172 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:35Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:36Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.022689 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.022942 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.023104 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.023235 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.023420 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:36Z","lastTransitionTime":"2025-11-27T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:36 crc kubenswrapper[4807]: E1127 11:10:36.051176 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:36Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.056494 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.056549 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.056613 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.056685 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.056709 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:36Z","lastTransitionTime":"2025-11-27T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:36 crc kubenswrapper[4807]: E1127 11:10:36.078724 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:36Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.084698 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.084913 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.085167 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.085496 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.085703 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:36Z","lastTransitionTime":"2025-11-27T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:36 crc kubenswrapper[4807]: E1127 11:10:36.126308 4807 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab395288-9712-459d-800d-cc193ee1f597\\\",\\\"systemUUID\\\":\\\"35d2adeb-2ca2-4bcb-8cf2-6b33d4c3912e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:36Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:36 crc kubenswrapper[4807]: E1127 11:10:36.126497 4807 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.129376 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.129454 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.129469 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.129501 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.129522 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:36Z","lastTransitionTime":"2025-11-27T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.232321 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.232351 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.232359 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.232371 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.232379 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:36Z","lastTransitionTime":"2025-11-27T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.335421 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.335481 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.335501 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.335525 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.335541 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:36Z","lastTransitionTime":"2025-11-27T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.438715 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.438786 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.438804 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.438827 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.438844 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:36Z","lastTransitionTime":"2025-11-27T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.541080 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.541121 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.541132 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.541147 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.541157 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:36Z","lastTransitionTime":"2025-11-27T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.644471 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.644513 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.644524 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.644541 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.644551 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:36Z","lastTransitionTime":"2025-11-27T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.747213 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.747300 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.747317 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.747338 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.747356 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:36Z","lastTransitionTime":"2025-11-27T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.848911 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.848958 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.849025 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.849046 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.849061 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:36Z","lastTransitionTime":"2025-11-27T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.951508 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.951536 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.951544 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.951556 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:36 crc kubenswrapper[4807]: I1127 11:10:36.951565 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:36Z","lastTransitionTime":"2025-11-27T11:10:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.053980 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.054043 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.054055 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.054073 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.054085 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:37Z","lastTransitionTime":"2025-11-27T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.156975 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.157019 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.157030 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.157048 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.157059 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:37Z","lastTransitionTime":"2025-11-27T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.259482 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.259525 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.259533 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.259547 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.259556 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:37Z","lastTransitionTime":"2025-11-27T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.362012 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.362066 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.362081 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.362141 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.362156 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:37Z","lastTransitionTime":"2025-11-27T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.464296 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.464334 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.464344 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.464360 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.464370 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:37Z","lastTransitionTime":"2025-11-27T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.531807 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:37 crc kubenswrapper[4807]: E1127 11:10:37.531936 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.531948 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:37 crc kubenswrapper[4807]: E1127 11:10:37.532014 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.531941 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.532040 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:37 crc kubenswrapper[4807]: E1127 11:10:37.532092 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:37 crc kubenswrapper[4807]: E1127 11:10:37.532207 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.566302 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.566345 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.566355 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.566373 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.566382 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:37Z","lastTransitionTime":"2025-11-27T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.668284 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.668324 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.668335 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.668351 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.668365 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:37Z","lastTransitionTime":"2025-11-27T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.770877 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.770951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.770970 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.770996 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.771015 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:37Z","lastTransitionTime":"2025-11-27T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.872757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.872803 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.872814 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.872827 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.872836 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:37Z","lastTransitionTime":"2025-11-27T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.974826 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.974863 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.974873 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.974887 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:37 crc kubenswrapper[4807]: I1127 11:10:37.974896 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:37Z","lastTransitionTime":"2025-11-27T11:10:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.077611 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.077659 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.077675 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.077698 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.077713 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:38Z","lastTransitionTime":"2025-11-27T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.180059 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.180093 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.180101 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.180116 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.180126 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:38Z","lastTransitionTime":"2025-11-27T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.281741 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.281844 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.281862 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.281887 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.281950 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:38Z","lastTransitionTime":"2025-11-27T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.385042 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.385130 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.385145 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.385167 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.385183 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:38Z","lastTransitionTime":"2025-11-27T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.487883 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.487927 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.487939 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.487955 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.487967 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:38Z","lastTransitionTime":"2025-11-27T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.591197 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.591279 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.591297 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.591319 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.591335 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:38Z","lastTransitionTime":"2025-11-27T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.692915 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.692953 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.692964 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.692979 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.692989 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:38Z","lastTransitionTime":"2025-11-27T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.796105 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.796145 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.796153 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.796169 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.796178 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:38Z","lastTransitionTime":"2025-11-27T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.898315 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.898347 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.898357 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.898372 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:38 crc kubenswrapper[4807]: I1127 11:10:38.898385 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:38Z","lastTransitionTime":"2025-11-27T11:10:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.000757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.000801 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.000812 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.000830 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.000846 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:39Z","lastTransitionTime":"2025-11-27T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.103172 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.103229 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.103239 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.103270 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.103281 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:39Z","lastTransitionTime":"2025-11-27T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.204969 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.204999 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.205008 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.205019 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.205029 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:39Z","lastTransitionTime":"2025-11-27T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.307852 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.307897 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.307907 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.307923 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.307931 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:39Z","lastTransitionTime":"2025-11-27T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.409417 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.409520 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.409545 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.409571 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.409594 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:39Z","lastTransitionTime":"2025-11-27T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.511614 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.511653 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.511662 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.511675 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.511684 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:39Z","lastTransitionTime":"2025-11-27T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.531514 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:39 crc kubenswrapper[4807]: E1127 11:10:39.531667 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.531848 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.531875 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.531910 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:39 crc kubenswrapper[4807]: E1127 11:10:39.532044 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:39 crc kubenswrapper[4807]: E1127 11:10:39.532372 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:39 crc kubenswrapper[4807]: E1127 11:10:39.532429 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.549985 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e63833acde1a26a072cc07e99762406d179ce7d41934afd0e8233e1bffd5c72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78b7ac2df93f4495bf5ffc6348335be2685c598a22467728b71ae69aee4f6f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.560034 4807 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://017746da330245bff540f631867e06d18155c062824d2bd0054a9afc5ffc3958\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.574736 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.585307 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aaae6992-39ea-4c99-b5e5-b4c025ec48f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5498fffc77330ae69a089f29dd40247470f591d5107bf9491b3938961ecfc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd76c06730caf399f3a17ead7d16a5afd905255f
ba63cbd15a3c92f8f88dbe2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dncsr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kk425\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.600041 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8fc25371-fd6f-439c-b3e0-415f96822338\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e3418ac3cdaf32a1788e7f9152c32e91566f9b923f31985bffa94a1a7978f088\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cc5b6e47aa99bf6abfb8fe0c24fc6bd039e1
117d892bfd3e1c683315b2def0c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mcnq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-btg9f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.613644 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.613706 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.613722 4807 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.614227 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.614283 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:39Z","lastTransitionTime":"2025-11-27T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.618628 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d840667c-8233-4eb0-9789-d41a8fb11fdb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d33545207add221fa61a7c8259b245fa2f114f53ef101d74503d4bbadad20fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88a8b1a2034b437496a624127bf754d60abab11457ad19e6e074fe454d0e21b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://284dd629e6c81f1232a2d56007a2a7471423b7a601c38bcd3bc264ab9586fc53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a497b615ee1244423b05f0860bdcbf3ea616c93e2b00d9d9a1d631946316a207\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.639185 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7e38385b-f81d-40a2-b5d6-e28b85aa9a47\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608f2caad767d90ca31b6ed92f78409ba656039a0110f0c2473808ebf681f0c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d21ba1015d140cf810f6ae2d063179a7efeaee4ba6462119c099d06206683d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54244f1aab4c3fa9aec08bbc9f8b2bb689495161d3dff5354811125beb4f454a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05095ec05d244294c83beb4f62b7bc5b02612c205961d165cf1760417cc66ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae5dda6efc31463c6510891ed370cef09caee79dec42b7fcc22edba6e3b9ef3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06573c13a51ed0b6fc8efb6469e69ee7caf2c697cb336512d1987457af83f9bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3dee037dde3203f05bfcc440e45ab21e9f89751d2b3c701dd3e0276f8b75778e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e07255b20c9be8509c9a77ea37becae98af197b495a5ce7d0472c45eae535b0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.654522 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.663430 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wszmz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"911bce2f-3fb2-484d-870f-d9737047bd10\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sjkg5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wszmz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc 
kubenswrapper[4807]: I1127 11:10:39.678661 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f448045-25f1-4986-8431-48771fd945ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9c8fd7ba060dd7b8895eabc774280b293743e610920aa570419311b70dc13efb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://505d6a81bc34e369c62f7c5d774901a80473a13029481c95d4219cac2883bfba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1accfd8e434819fc63c2cf3e5bba2388a69a34a655414d4e0eb9b2fb8647b45d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.696314 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f4e9a14fc30cccc45960c9d4afbaf9dfda7a03f64391c78b79592a6903f4916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.705277 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5fv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7be9a58d-0876-441d-b6eb-6d0b3412abac\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60002964d0eabdcf2985f6791e3c7df54457315046bb5def72c19dbe5e0d0c13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d7g6r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:41Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5fv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.717132 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.717199 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.717217 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.717286 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.717306 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:39Z","lastTransitionTime":"2025-11-27T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.732406 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c85b740-1df9-4ae7-a51b-fdfd89668d64\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:33Z\\\",\\\"message\\\":\\\"rding success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI1127 11:10:33.295482 6781 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/network-metrics-daemon-wszmz\\\\nI1127 11:10:33.295490 6781 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF1127 11:10:33.295491 6781 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:33Z is after 2025-08-24T17:21:41Z]\\\\nI1127 11:10:33.295497 6781 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1127 11:10:33.2955\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:10:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfce079411e27707c0
e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7nmsn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-lwph9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.747680 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-k6wll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"579992dc-49bf-49ea-ad07-62beba6397df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://769602fea6dda4d9914305042a6253eb81104f1e0ce0f713a03815bcf909697e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e80828a541b49bb79e1a1f1efd40b805f07d2bb1a9cb36ae89e40f35f6a69bed\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c6dd4df10ccdeee995ab96c16f214d3352083b94f40fe3eaf5da3c1a5f1513f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca4a886e0f30ceb4ad876d9d67cf0f10e68b870de35f7f1036ed009f843e3690\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19166
3197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://191663197bab4c126606506d6c0eff72c1728570044e3b36c049ccf03f8672b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://683caccb693002aabe70f657047763b0454a187e7a850a6839ba865010f30feb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:43Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ce7def97a812ddf7b9209a889c6082a97298f03725a22046cd7dd8f98a1eae5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w44dv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-k6wll\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.767812 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05153e77-990e-4b38-89e3-d4f962674fa9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-27T11:09:33Z\\\",\\\"message\\\":\\\"W1127 11:09:22.514343 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1127 11:09:22.514687 1 crypto.go:601] Generating new CA for check-endpoints-signer@1764241762 cert, and key in /tmp/serving-cert-2651821769/serving-signer.crt, /tmp/serving-cert-2651821769/serving-signer.key\\\\nI1127 11:09:22.759404 1 observer_polling.go:159] Starting file observer\\\\nW1127 11:09:22.764999 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1127 11:09:22.765095 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1127 11:09:22.765605 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2651821769/tls.crt::/tmp/serving-cert-2651821769/tls.key\\\\\\\"\\\\nF1127 11:09:33.218500 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.778820 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d2fb849-c866-48ca-a9fb-5f64ce2b6851\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5855d42391ccbd3f32e7abf944c071e0912ec43fa3137269b842b95e6907b209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef3
18bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8d7a4e29063b23851d4d1835b043247ed65f1b1b981303b931e9e458434cb8bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-27T11:09:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:19Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.790933 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xmngf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"97f15cbb-220e-47db-b418-3a5aa4eb55a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:10:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a509853063b75406f7fc467d6ab041935f8cee585f17f27c8618916093d4a624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-27T11:10:26Z\\\",\\\"message\\\":\\\"2025-11-27T11:09:40+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e\\\\n2025-11-27T11:09:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8eb53c16-a0cf-4377-be1d-396a3d30be1e to /host/opt/cni/bin/\\\\n2025-11-27T11:09:41Z [verbose] multus-daemon started\\\\n2025-11-27T11:09:41Z [verbose] Readiness Indicator file check\\\\n2025-11-27T11:10:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:10:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5bxfj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xmngf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.805068 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.814819 4807 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-d4dvd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e030331a-9097-479c-8226-8553c1423ae4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-27T11:09:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65711bb128dcd561b4fac144866e08bf0ab67bfb108b26d815ee9bd70d5523ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-27T11:09:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ksfsj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-27T11:09:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-d4dvd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-27T11:10:39Z is after 2025-08-24T17:21:41Z" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.819790 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.819822 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.819833 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.819850 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.819863 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:39Z","lastTransitionTime":"2025-11-27T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.922081 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.922123 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.922140 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.922162 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:39 crc kubenswrapper[4807]: I1127 11:10:39.922178 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:39Z","lastTransitionTime":"2025-11-27T11:10:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.024733 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.024778 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.024795 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.024817 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.024833 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:40Z","lastTransitionTime":"2025-11-27T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.126797 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.126842 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.126858 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.126879 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.126894 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:40Z","lastTransitionTime":"2025-11-27T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.228969 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.229047 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.229070 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.229095 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.229111 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:40Z","lastTransitionTime":"2025-11-27T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.332301 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.332357 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.332379 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.332409 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.332430 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:40Z","lastTransitionTime":"2025-11-27T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.435513 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.435557 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.435569 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.435588 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.435599 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:40Z","lastTransitionTime":"2025-11-27T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.537709 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.537766 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.537784 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.537805 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.537821 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:40Z","lastTransitionTime":"2025-11-27T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.641293 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.641333 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.641344 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.641360 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.641373 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:40Z","lastTransitionTime":"2025-11-27T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.744871 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.744951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.744974 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.744997 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.745015 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:40Z","lastTransitionTime":"2025-11-27T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.847705 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.847778 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.847802 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.847828 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.847845 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:40Z","lastTransitionTime":"2025-11-27T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.950840 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.950877 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.950894 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.950913 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:40 crc kubenswrapper[4807]: I1127 11:10:40.950925 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:40Z","lastTransitionTime":"2025-11-27T11:10:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.053570 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.053613 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.053623 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.053636 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.053645 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:41Z","lastTransitionTime":"2025-11-27T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.156519 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.156562 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.156577 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.156594 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.156607 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:41Z","lastTransitionTime":"2025-11-27T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.259551 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.259701 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.259721 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.259735 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.259743 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:41Z","lastTransitionTime":"2025-11-27T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.362458 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.362501 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.362513 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.362555 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.362567 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:41Z","lastTransitionTime":"2025-11-27T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.464648 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.464706 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.464720 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.464738 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.464752 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:41Z","lastTransitionTime":"2025-11-27T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.531609 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.531838 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.531873 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.531874 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:41 crc kubenswrapper[4807]: E1127 11:10:41.531966 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:41 crc kubenswrapper[4807]: E1127 11:10:41.532047 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:41 crc kubenswrapper[4807]: E1127 11:10:41.532126 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:41 crc kubenswrapper[4807]: E1127 11:10:41.532341 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.568616 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.568658 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.568683 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.568704 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.568719 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:41Z","lastTransitionTime":"2025-11-27T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.672020 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.672083 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.672107 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.672136 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.672157 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:41Z","lastTransitionTime":"2025-11-27T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.775740 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.775794 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.775826 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.775846 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.775860 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:41Z","lastTransitionTime":"2025-11-27T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.878760 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.879124 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.879152 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.879180 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.879203 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:41Z","lastTransitionTime":"2025-11-27T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.981891 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.981928 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.981939 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.981955 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:41 crc kubenswrapper[4807]: I1127 11:10:41.981967 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:41Z","lastTransitionTime":"2025-11-27T11:10:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.084791 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.084844 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.084856 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.084874 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.084886 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:42Z","lastTransitionTime":"2025-11-27T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.189665 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.189737 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.189766 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.189804 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.189827 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:42Z","lastTransitionTime":"2025-11-27T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.292726 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.292791 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.292870 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.292946 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.292965 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:42Z","lastTransitionTime":"2025-11-27T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.395491 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.395580 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.395597 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.395622 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.395639 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:42Z","lastTransitionTime":"2025-11-27T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.498186 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.498298 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.498324 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.498356 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.498379 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:42Z","lastTransitionTime":"2025-11-27T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.601212 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.601306 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.601327 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.601352 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.601372 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:42Z","lastTransitionTime":"2025-11-27T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.704834 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.704892 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.704910 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.704933 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.704950 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:42Z","lastTransitionTime":"2025-11-27T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.807052 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.807092 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.807102 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.807115 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.807123 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:42Z","lastTransitionTime":"2025-11-27T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.873766 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.873913 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.873943 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:42 crc kubenswrapper[4807]: E1127 11:10:42.874060 4807 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:10:42 crc kubenswrapper[4807]: E1127 11:10:42.874073 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.874034729 +0000 UTC m=+147.973532957 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:10:42 crc kubenswrapper[4807]: E1127 11:10:42.874125 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.874111661 +0000 UTC m=+147.973609899 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 27 11:10:42 crc kubenswrapper[4807]: E1127 11:10:42.874147 4807 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:10:42 crc kubenswrapper[4807]: E1127 11:10:42.874292 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.874231814 +0000 UTC m=+147.973730122 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.910101 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.910385 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.910404 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.910423 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.910433 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:42Z","lastTransitionTime":"2025-11-27T11:10:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.975016 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:42 crc kubenswrapper[4807]: I1127 11:10:42.975072 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:42 crc kubenswrapper[4807]: E1127 11:10:42.975238 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:10:42 crc kubenswrapper[4807]: E1127 11:10:42.975293 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:10:42 crc kubenswrapper[4807]: E1127 11:10:42.975308 4807 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:10:42 crc kubenswrapper[4807]: E1127 11:10:42.975402 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.975350322 +0000 UTC m=+148.074848540 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:10:42 crc kubenswrapper[4807]: E1127 11:10:42.975450 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 27 11:10:42 crc kubenswrapper[4807]: E1127 11:10:42.975500 4807 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 27 11:10:42 crc kubenswrapper[4807]: E1127 11:10:42.975522 4807 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:10:42 crc kubenswrapper[4807]: E1127 11:10:42.975601 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.975574928 +0000 UTC m=+148.075073176 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.012937 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.012988 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.013007 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.013030 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.013048 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:43Z","lastTransitionTime":"2025-11-27T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.115796 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.115868 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.115882 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.115900 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.115915 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:43Z","lastTransitionTime":"2025-11-27T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.218951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.219010 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.219028 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.219051 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.219071 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:43Z","lastTransitionTime":"2025-11-27T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.321905 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.321971 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.321991 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.322016 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.322033 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:43Z","lastTransitionTime":"2025-11-27T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.424830 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.424927 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.424951 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.424985 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.425008 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:43Z","lastTransitionTime":"2025-11-27T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.527929 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.527984 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.528001 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.528025 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.528042 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:43Z","lastTransitionTime":"2025-11-27T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.531462 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.531631 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.531912 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.531971 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:43 crc kubenswrapper[4807]: E1127 11:10:43.532165 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:43 crc kubenswrapper[4807]: E1127 11:10:43.532404 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:43 crc kubenswrapper[4807]: E1127 11:10:43.532502 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:43 crc kubenswrapper[4807]: E1127 11:10:43.532651 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.630978 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.631029 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.631041 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.631058 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.631105 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:43Z","lastTransitionTime":"2025-11-27T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.734159 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.734230 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.734279 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.734314 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.734337 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:43Z","lastTransitionTime":"2025-11-27T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.837278 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.837340 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.837358 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.837383 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.837404 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:43Z","lastTransitionTime":"2025-11-27T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.941016 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.941078 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.941098 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.941123 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:43 crc kubenswrapper[4807]: I1127 11:10:43.941140 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:43Z","lastTransitionTime":"2025-11-27T11:10:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.043639 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.043709 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.043731 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.043757 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.043775 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:44Z","lastTransitionTime":"2025-11-27T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.146945 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.147042 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.147060 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.147085 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.147102 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:44Z","lastTransitionTime":"2025-11-27T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.249872 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.249936 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.249953 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.249977 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.249994 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:44Z","lastTransitionTime":"2025-11-27T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.352455 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.352514 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.352525 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.352541 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.352552 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:44Z","lastTransitionTime":"2025-11-27T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.454966 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.455009 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.455018 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.455031 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.455043 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:44Z","lastTransitionTime":"2025-11-27T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.556995 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.557058 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.557075 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.557153 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.557171 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:44Z","lastTransitionTime":"2025-11-27T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.659581 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.659620 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.659630 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.659646 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.659657 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:44Z","lastTransitionTime":"2025-11-27T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.761427 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.761461 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.761469 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.761481 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.761489 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:44Z","lastTransitionTime":"2025-11-27T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.864549 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.864600 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.864615 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.864637 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.864653 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:44Z","lastTransitionTime":"2025-11-27T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.967076 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.967141 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.967154 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.967172 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:44 crc kubenswrapper[4807]: I1127 11:10:44.967184 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:44Z","lastTransitionTime":"2025-11-27T11:10:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.069290 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.069342 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.069356 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.069378 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.069393 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:45Z","lastTransitionTime":"2025-11-27T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.171666 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.171706 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.171717 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.171733 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.171744 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:45Z","lastTransitionTime":"2025-11-27T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.274432 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.274500 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.274518 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.274543 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.274560 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:45Z","lastTransitionTime":"2025-11-27T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.378192 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.378322 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.378352 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.378387 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.378414 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:45Z","lastTransitionTime":"2025-11-27T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.481817 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.481874 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.481891 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.481913 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.481930 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:45Z","lastTransitionTime":"2025-11-27T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.531740 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.531813 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:45 crc kubenswrapper[4807]: E1127 11:10:45.532062 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.532082 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.532124 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:45 crc kubenswrapper[4807]: E1127 11:10:45.532298 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:45 crc kubenswrapper[4807]: E1127 11:10:45.532384 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.533658 4807 scope.go:117] "RemoveContainer" containerID="a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9" Nov 27 11:10:45 crc kubenswrapper[4807]: E1127 11:10:45.533926 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" Nov 27 11:10:45 crc kubenswrapper[4807]: E1127 11:10:45.534285 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.584739 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.584840 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.584852 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.584876 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.584893 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:45Z","lastTransitionTime":"2025-11-27T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.688009 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.688064 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.688078 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.688097 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.688110 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:45Z","lastTransitionTime":"2025-11-27T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.791236 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.791354 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.791374 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.791400 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.791417 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:45Z","lastTransitionTime":"2025-11-27T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.894237 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.894348 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.894371 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.894399 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.894419 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:45Z","lastTransitionTime":"2025-11-27T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.997777 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.997844 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.997861 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.997885 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:45 crc kubenswrapper[4807]: I1127 11:10:45.997906 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:45Z","lastTransitionTime":"2025-11-27T11:10:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.100425 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.100481 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.100497 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.100520 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.100537 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:46Z","lastTransitionTime":"2025-11-27T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.203379 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.203446 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.203463 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.203487 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.203503 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:46Z","lastTransitionTime":"2025-11-27T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.306087 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.306291 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.306313 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.306339 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.306357 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:46Z","lastTransitionTime":"2025-11-27T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.400091 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.400174 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.400214 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.400293 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.400319 4807 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-27T11:10:46Z","lastTransitionTime":"2025-11-27T11:10:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.471270 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l"] Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.472093 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.475411 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.475817 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.476094 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.476731 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.517038 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-btg9f" podStartSLOduration=67.517009746 podStartE2EDuration="1m7.517009746s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:10:46.497542309 +0000 UTC m=+87.597040567" watchObservedRunningTime="2025-11-27 11:10:46.517009746 +0000 UTC m=+87.616507984" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.517228 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=35.517220422 podStartE2EDuration="35.517220422s" podCreationTimestamp="2025-11-27 11:10:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:10:46.516603204 +0000 UTC m=+87.616101402" watchObservedRunningTime="2025-11-27 11:10:46.517220422 
+0000 UTC m=+87.616718660" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.559604 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=67.559582141 podStartE2EDuration="1m7.559582141s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:10:46.558754008 +0000 UTC m=+87.658252206" watchObservedRunningTime="2025-11-27 11:10:46.559582141 +0000 UTC m=+87.659080349" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.620759 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/16a48d2f-a73c-4957-802c-cb296193aec0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.620838 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/16a48d2f-a73c-4957-802c-cb296193aec0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.620887 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16a48d2f-a73c-4957-802c-cb296193aec0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.620968 
4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16a48d2f-a73c-4957-802c-cb296193aec0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.621000 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16a48d2f-a73c-4957-802c-cb296193aec0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.664012 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=63.663983111 podStartE2EDuration="1m3.663983111s" podCreationTimestamp="2025-11-27 11:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:10:46.663650592 +0000 UTC m=+87.763148810" watchObservedRunningTime="2025-11-27 11:10:46.663983111 +0000 UTC m=+87.763481319" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.664118 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podStartSLOduration=68.664114835 podStartE2EDuration="1m8.664114835s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:10:46.647818497 +0000 UTC m=+87.747316725" watchObservedRunningTime="2025-11-27 11:10:46.664114835 +0000 UTC 
m=+87.763613023" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.703584 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5fv8w" podStartSLOduration=68.703563412 podStartE2EDuration="1m8.703563412s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:10:46.693435358 +0000 UTC m=+87.792933566" watchObservedRunningTime="2025-11-27 11:10:46.703563412 +0000 UTC m=+87.803061610" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.722263 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16a48d2f-a73c-4957-802c-cb296193aec0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.722313 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16a48d2f-a73c-4957-802c-cb296193aec0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.722340 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/16a48d2f-a73c-4957-802c-cb296193aec0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.722383 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/16a48d2f-a73c-4957-802c-cb296193aec0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.722403 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16a48d2f-a73c-4957-802c-cb296193aec0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.722724 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/16a48d2f-a73c-4957-802c-cb296193aec0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.723103 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/16a48d2f-a73c-4957-802c-cb296193aec0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.723407 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16a48d2f-a73c-4957-802c-cb296193aec0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.729258 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16a48d2f-a73c-4957-802c-cb296193aec0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.736924 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=67.736907688 podStartE2EDuration="1m7.736907688s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:10:46.736047264 +0000 UTC m=+87.835545462" watchObservedRunningTime="2025-11-27 11:10:46.736907688 +0000 UTC m=+87.836405886" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.740556 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16a48d2f-a73c-4957-802c-cb296193aec0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-whc6l\" (UID: \"16a48d2f-a73c-4957-802c-cb296193aec0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.748585 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=20.748565775 podStartE2EDuration="20.748565775s" podCreationTimestamp="2025-11-27 11:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:10:46.747842335 +0000 UTC m=+87.847340543" 
watchObservedRunningTime="2025-11-27 11:10:46.748565775 +0000 UTC m=+87.848063993" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.761462 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xmngf" podStartSLOduration=68.761443496 podStartE2EDuration="1m8.761443496s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:10:46.759866892 +0000 UTC m=+87.859365100" watchObservedRunningTime="2025-11-27 11:10:46.761443496 +0000 UTC m=+87.860941714" Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.788534 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" Nov 27 11:10:46 crc kubenswrapper[4807]: W1127 11:10:46.804232 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16a48d2f_a73c_4957_802c_cb296193aec0.slice/crio-affbdc48c5078f8ef56ebfe6cc9204736043fbc3e1e5b88f4bb198a54dc2cc8d WatchSource:0}: Error finding container affbdc48c5078f8ef56ebfe6cc9204736043fbc3e1e5b88f4bb198a54dc2cc8d: Status 404 returned error can't find the container with id affbdc48c5078f8ef56ebfe6cc9204736043fbc3e1e5b88f4bb198a54dc2cc8d Nov 27 11:10:46 crc kubenswrapper[4807]: I1127 11:10:46.842778 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-k6wll" podStartSLOduration=68.842760449 podStartE2EDuration="1m8.842760449s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:10:46.807623503 +0000 UTC m=+87.907121701" watchObservedRunningTime="2025-11-27 11:10:46.842760449 +0000 UTC m=+87.942258657" Nov 27 11:10:46 crc 
kubenswrapper[4807]: I1127 11:10:46.865513 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d4dvd" podStartSLOduration=68.865496677 podStartE2EDuration="1m8.865496677s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:10:46.865055445 +0000 UTC m=+87.964553663" watchObservedRunningTime="2025-11-27 11:10:46.865496677 +0000 UTC m=+87.964994885" Nov 27 11:10:47 crc kubenswrapper[4807]: I1127 11:10:47.039363 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" event={"ID":"16a48d2f-a73c-4957-802c-cb296193aec0","Type":"ContainerStarted","Data":"5d28a20535e8dda92ad2042ecc01e233b0b19bc225a99cbf4d1f8bbdb4aa8778"} Nov 27 11:10:47 crc kubenswrapper[4807]: I1127 11:10:47.039421 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" event={"ID":"16a48d2f-a73c-4957-802c-cb296193aec0","Type":"ContainerStarted","Data":"affbdc48c5078f8ef56ebfe6cc9204736043fbc3e1e5b88f4bb198a54dc2cc8d"} Nov 27 11:10:47 crc kubenswrapper[4807]: I1127 11:10:47.051708 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-whc6l" podStartSLOduration=69.051690383 podStartE2EDuration="1m9.051690383s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:10:47.051501278 +0000 UTC m=+88.150999546" watchObservedRunningTime="2025-11-27 11:10:47.051690383 +0000 UTC m=+88.151188611" Nov 27 11:10:47 crc kubenswrapper[4807]: I1127 11:10:47.535442 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:47 crc kubenswrapper[4807]: I1127 11:10:47.535491 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:47 crc kubenswrapper[4807]: I1127 11:10:47.535549 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:47 crc kubenswrapper[4807]: E1127 11:10:47.535689 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:47 crc kubenswrapper[4807]: E1127 11:10:47.535767 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:47 crc kubenswrapper[4807]: E1127 11:10:47.536163 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:47 crc kubenswrapper[4807]: I1127 11:10:47.536686 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:47 crc kubenswrapper[4807]: E1127 11:10:47.536968 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:49 crc kubenswrapper[4807]: I1127 11:10:49.532007 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:49 crc kubenswrapper[4807]: I1127 11:10:49.532020 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:49 crc kubenswrapper[4807]: I1127 11:10:49.532099 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:49 crc kubenswrapper[4807]: I1127 11:10:49.533651 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:49 crc kubenswrapper[4807]: E1127 11:10:49.533632 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:49 crc kubenswrapper[4807]: E1127 11:10:49.533717 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:49 crc kubenswrapper[4807]: E1127 11:10:49.533792 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:49 crc kubenswrapper[4807]: E1127 11:10:49.533904 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:51 crc kubenswrapper[4807]: I1127 11:10:51.531860 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:51 crc kubenswrapper[4807]: I1127 11:10:51.531979 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:51 crc kubenswrapper[4807]: E1127 11:10:51.532015 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:51 crc kubenswrapper[4807]: I1127 11:10:51.531881 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:51 crc kubenswrapper[4807]: I1127 11:10:51.532068 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:51 crc kubenswrapper[4807]: E1127 11:10:51.532676 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:51 crc kubenswrapper[4807]: E1127 11:10:51.532737 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:51 crc kubenswrapper[4807]: E1127 11:10:51.532927 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:53 crc kubenswrapper[4807]: I1127 11:10:53.531926 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:53 crc kubenswrapper[4807]: I1127 11:10:53.532047 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:53 crc kubenswrapper[4807]: I1127 11:10:53.532219 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:53 crc kubenswrapper[4807]: E1127 11:10:53.532204 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:53 crc kubenswrapper[4807]: E1127 11:10:53.532400 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:53 crc kubenswrapper[4807]: E1127 11:10:53.532501 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:53 crc kubenswrapper[4807]: I1127 11:10:53.533304 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:53 crc kubenswrapper[4807]: E1127 11:10:53.533382 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:55 crc kubenswrapper[4807]: I1127 11:10:55.532177 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:55 crc kubenswrapper[4807]: I1127 11:10:55.532234 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:55 crc kubenswrapper[4807]: I1127 11:10:55.532176 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:55 crc kubenswrapper[4807]: I1127 11:10:55.532369 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:55 crc kubenswrapper[4807]: E1127 11:10:55.532554 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:55 crc kubenswrapper[4807]: E1127 11:10:55.532672 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:55 crc kubenswrapper[4807]: E1127 11:10:55.532746 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:55 crc kubenswrapper[4807]: E1127 11:10:55.532839 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:56 crc kubenswrapper[4807]: I1127 11:10:56.425818 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:56 crc kubenswrapper[4807]: E1127 11:10:56.425927 4807 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:10:56 crc kubenswrapper[4807]: E1127 11:10:56.426225 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs podName:911bce2f-3fb2-484d-870f-d9737047bd10 nodeName:}" failed. No retries permitted until 2025-11-27 11:12:00.426207629 +0000 UTC m=+161.525705827 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs") pod "network-metrics-daemon-wszmz" (UID: "911bce2f-3fb2-484d-870f-d9737047bd10") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 27 11:10:57 crc kubenswrapper[4807]: I1127 11:10:57.532123 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:57 crc kubenswrapper[4807]: I1127 11:10:57.532176 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:57 crc kubenswrapper[4807]: E1127 11:10:57.532399 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:10:57 crc kubenswrapper[4807]: I1127 11:10:57.532420 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:57 crc kubenswrapper[4807]: I1127 11:10:57.532697 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:57 crc kubenswrapper[4807]: E1127 11:10:57.532825 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:57 crc kubenswrapper[4807]: I1127 11:10:57.532948 4807 scope.go:117] "RemoveContainer" containerID="a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9" Nov 27 11:10:57 crc kubenswrapper[4807]: E1127 11:10:57.532940 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:57 crc kubenswrapper[4807]: E1127 11:10:57.533030 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:57 crc kubenswrapper[4807]: E1127 11:10:57.533087 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" Nov 27 11:10:59 crc kubenswrapper[4807]: I1127 11:10:59.531865 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:10:59 crc kubenswrapper[4807]: I1127 11:10:59.532041 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:10:59 crc kubenswrapper[4807]: I1127 11:10:59.532087 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:10:59 crc kubenswrapper[4807]: I1127 11:10:59.532144 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:10:59 crc kubenswrapper[4807]: E1127 11:10:59.533033 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:10:59 crc kubenswrapper[4807]: E1127 11:10:59.533141 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:10:59 crc kubenswrapper[4807]: E1127 11:10:59.533240 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:10:59 crc kubenswrapper[4807]: E1127 11:10:59.533338 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:11:01 crc kubenswrapper[4807]: I1127 11:11:01.532325 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:11:01 crc kubenswrapper[4807]: E1127 11:11:01.533122 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:11:01 crc kubenswrapper[4807]: I1127 11:11:01.532517 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:11:01 crc kubenswrapper[4807]: E1127 11:11:01.533448 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:11:01 crc kubenswrapper[4807]: I1127 11:11:01.532598 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:11:01 crc kubenswrapper[4807]: E1127 11:11:01.533625 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:11:01 crc kubenswrapper[4807]: I1127 11:11:01.532477 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:11:01 crc kubenswrapper[4807]: E1127 11:11:01.534021 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:11:03 crc kubenswrapper[4807]: I1127 11:11:03.531869 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:11:03 crc kubenswrapper[4807]: I1127 11:11:03.531886 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:11:03 crc kubenswrapper[4807]: I1127 11:11:03.531872 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:11:03 crc kubenswrapper[4807]: I1127 11:11:03.531942 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:11:03 crc kubenswrapper[4807]: E1127 11:11:03.532108 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:11:03 crc kubenswrapper[4807]: E1127 11:11:03.532373 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:11:03 crc kubenswrapper[4807]: E1127 11:11:03.532641 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:11:03 crc kubenswrapper[4807]: E1127 11:11:03.532738 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:11:05 crc kubenswrapper[4807]: I1127 11:11:05.531910 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:11:05 crc kubenswrapper[4807]: I1127 11:11:05.531969 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:11:05 crc kubenswrapper[4807]: I1127 11:11:05.532000 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:11:05 crc kubenswrapper[4807]: E1127 11:11:05.532090 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:11:05 crc kubenswrapper[4807]: I1127 11:11:05.532166 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:11:05 crc kubenswrapper[4807]: E1127 11:11:05.532404 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:11:05 crc kubenswrapper[4807]: E1127 11:11:05.532471 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:11:05 crc kubenswrapper[4807]: E1127 11:11:05.532600 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:11:07 crc kubenswrapper[4807]: I1127 11:11:07.532018 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:11:07 crc kubenswrapper[4807]: I1127 11:11:07.532063 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:11:07 crc kubenswrapper[4807]: I1127 11:11:07.532151 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:11:07 crc kubenswrapper[4807]: E1127 11:11:07.532309 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:11:07 crc kubenswrapper[4807]: I1127 11:11:07.532333 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:11:07 crc kubenswrapper[4807]: E1127 11:11:07.532556 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:11:07 crc kubenswrapper[4807]: E1127 11:11:07.532751 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:11:07 crc kubenswrapper[4807]: E1127 11:11:07.532871 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:11:09 crc kubenswrapper[4807]: I1127 11:11:09.532129 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:11:09 crc kubenswrapper[4807]: I1127 11:11:09.532125 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:11:09 crc kubenswrapper[4807]: I1127 11:11:09.532138 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:11:09 crc kubenswrapper[4807]: I1127 11:11:09.532182 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:11:09 crc kubenswrapper[4807]: E1127 11:11:09.534273 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10" Nov 27 11:11:09 crc kubenswrapper[4807]: E1127 11:11:09.534400 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 27 11:11:09 crc kubenswrapper[4807]: E1127 11:11:09.534565 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 27 11:11:09 crc kubenswrapper[4807]: E1127 11:11:09.534655 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 27 11:11:11 crc kubenswrapper[4807]: I1127 11:11:11.531722 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz"
Nov 27 11:11:11 crc kubenswrapper[4807]: E1127 11:11:11.531903 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10"
Nov 27 11:11:11 crc kubenswrapper[4807]: I1127 11:11:11.531962 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 11:11:11 crc kubenswrapper[4807]: I1127 11:11:11.531995 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 11:11:11 crc kubenswrapper[4807]: I1127 11:11:11.532057 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 11:11:11 crc kubenswrapper[4807]: E1127 11:11:11.532196 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 11:11:11 crc kubenswrapper[4807]: E1127 11:11:11.532915 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 11:11:11 crc kubenswrapper[4807]: E1127 11:11:11.533067 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 11:11:11 crc kubenswrapper[4807]: I1127 11:11:11.533390 4807 scope.go:117] "RemoveContainer" containerID="a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9"
Nov 27 11:11:11 crc kubenswrapper[4807]: E1127 11:11:11.533690 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-lwph9_openshift-ovn-kubernetes(9c85b740-1df9-4ae7-a51b-fdfd89668d64)\"" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64"
Nov 27 11:11:13 crc kubenswrapper[4807]: I1127 11:11:13.531515 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 11:11:13 crc kubenswrapper[4807]: I1127 11:11:13.531519 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz"
Nov 27 11:11:13 crc kubenswrapper[4807]: E1127 11:11:13.531694 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 11:11:13 crc kubenswrapper[4807]: E1127 11:11:13.531867 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10"
Nov 27 11:11:13 crc kubenswrapper[4807]: I1127 11:11:13.532498 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 11:11:13 crc kubenswrapper[4807]: E1127 11:11:13.532565 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 11:11:13 crc kubenswrapper[4807]: I1127 11:11:13.532725 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 11:11:13 crc kubenswrapper[4807]: E1127 11:11:13.533018 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 11:11:14 crc kubenswrapper[4807]: I1127 11:11:14.168999 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmngf_97f15cbb-220e-47db-b418-3a5aa4eb55a2/kube-multus/1.log"
Nov 27 11:11:14 crc kubenswrapper[4807]: I1127 11:11:14.169837 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmngf_97f15cbb-220e-47db-b418-3a5aa4eb55a2/kube-multus/0.log"
Nov 27 11:11:14 crc kubenswrapper[4807]: I1127 11:11:14.169927 4807 generic.go:334] "Generic (PLEG): container finished" podID="97f15cbb-220e-47db-b418-3a5aa4eb55a2" containerID="a509853063b75406f7fc467d6ab041935f8cee585f17f27c8618916093d4a624" exitCode=1
Nov 27 11:11:14 crc kubenswrapper[4807]: I1127 11:11:14.169980 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmngf" event={"ID":"97f15cbb-220e-47db-b418-3a5aa4eb55a2","Type":"ContainerDied","Data":"a509853063b75406f7fc467d6ab041935f8cee585f17f27c8618916093d4a624"}
Nov 27 11:11:14 crc kubenswrapper[4807]: I1127 11:11:14.170038 4807 scope.go:117] "RemoveContainer" containerID="396d78039384059be50eb4daa1b01b93b843c93971dc2962546eba87a1b1af64"
Nov 27 11:11:14 crc kubenswrapper[4807]: I1127 11:11:14.170703 4807 scope.go:117] "RemoveContainer" containerID="a509853063b75406f7fc467d6ab041935f8cee585f17f27c8618916093d4a624"
Nov 27 11:11:14 crc kubenswrapper[4807]: E1127 11:11:14.171149 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xmngf_openshift-multus(97f15cbb-220e-47db-b418-3a5aa4eb55a2)\"" pod="openshift-multus/multus-xmngf" podUID="97f15cbb-220e-47db-b418-3a5aa4eb55a2"
Nov 27 11:11:15 crc kubenswrapper[4807]: I1127 11:11:15.175300 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmngf_97f15cbb-220e-47db-b418-3a5aa4eb55a2/kube-multus/1.log"
Nov 27 11:11:15 crc kubenswrapper[4807]: I1127 11:11:15.531462 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 11:11:15 crc kubenswrapper[4807]: I1127 11:11:15.531513 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 11:11:15 crc kubenswrapper[4807]: E1127 11:11:15.531830 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 11:11:15 crc kubenswrapper[4807]: I1127 11:11:15.531900 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz"
Nov 27 11:11:15 crc kubenswrapper[4807]: E1127 11:11:15.532057 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 11:11:15 crc kubenswrapper[4807]: E1127 11:11:15.532198 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10"
Nov 27 11:11:15 crc kubenswrapper[4807]: I1127 11:11:15.532458 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 11:11:15 crc kubenswrapper[4807]: E1127 11:11:15.532626 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 11:11:17 crc kubenswrapper[4807]: I1127 11:11:17.531330 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 11:11:17 crc kubenswrapper[4807]: I1127 11:11:17.531381 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 11:11:17 crc kubenswrapper[4807]: I1127 11:11:17.531391 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz"
Nov 27 11:11:17 crc kubenswrapper[4807]: I1127 11:11:17.531561 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 11:11:17 crc kubenswrapper[4807]: E1127 11:11:17.531549 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 11:11:17 crc kubenswrapper[4807]: E1127 11:11:17.531736 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 11:11:17 crc kubenswrapper[4807]: E1127 11:11:17.531819 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 11:11:17 crc kubenswrapper[4807]: E1127 11:11:17.531988 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10"
Nov 27 11:11:19 crc kubenswrapper[4807]: E1127 11:11:19.492704 4807 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Nov 27 11:11:19 crc kubenswrapper[4807]: I1127 11:11:19.531875 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 11:11:19 crc kubenswrapper[4807]: I1127 11:11:19.531879 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 11:11:19 crc kubenswrapper[4807]: I1127 11:11:19.531940 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz"
Nov 27 11:11:19 crc kubenswrapper[4807]: I1127 11:11:19.531965 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 11:11:19 crc kubenswrapper[4807]: E1127 11:11:19.532883 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 11:11:19 crc kubenswrapper[4807]: E1127 11:11:19.533016 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10"
Nov 27 11:11:19 crc kubenswrapper[4807]: E1127 11:11:19.533169 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 11:11:19 crc kubenswrapper[4807]: E1127 11:11:19.533275 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 11:11:19 crc kubenswrapper[4807]: E1127 11:11:19.623197 4807 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Nov 27 11:11:21 crc kubenswrapper[4807]: I1127 11:11:21.532431 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz"
Nov 27 11:11:21 crc kubenswrapper[4807]: I1127 11:11:21.532494 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 11:11:21 crc kubenswrapper[4807]: I1127 11:11:21.532360 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 11:11:21 crc kubenswrapper[4807]: I1127 11:11:21.532452 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 11:11:21 crc kubenswrapper[4807]: E1127 11:11:21.532658 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10"
Nov 27 11:11:21 crc kubenswrapper[4807]: E1127 11:11:21.532939 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 11:11:21 crc kubenswrapper[4807]: E1127 11:11:21.532998 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 11:11:21 crc kubenswrapper[4807]: E1127 11:11:21.532891 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 11:11:23 crc kubenswrapper[4807]: I1127 11:11:23.531989 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 11:11:23 crc kubenswrapper[4807]: I1127 11:11:23.532088 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 11:11:23 crc kubenswrapper[4807]: E1127 11:11:23.532156 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 11:11:23 crc kubenswrapper[4807]: I1127 11:11:23.532182 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz"
Nov 27 11:11:23 crc kubenswrapper[4807]: E1127 11:11:23.532385 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 11:11:23 crc kubenswrapper[4807]: I1127 11:11:23.532464 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 11:11:23 crc kubenswrapper[4807]: E1127 11:11:23.532632 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10"
Nov 27 11:11:23 crc kubenswrapper[4807]: E1127 11:11:23.532724 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 11:11:24 crc kubenswrapper[4807]: I1127 11:11:24.532959 4807 scope.go:117] "RemoveContainer" containerID="a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9"
Nov 27 11:11:24 crc kubenswrapper[4807]: E1127 11:11:24.624494 4807 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Nov 27 11:11:25 crc kubenswrapper[4807]: I1127 11:11:25.207625 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/3.log"
Nov 27 11:11:25 crc kubenswrapper[4807]: I1127 11:11:25.210283 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerStarted","Data":"56ff2fa9700366e563601890e830cfce95805680d32c1d0bc0fa275c8cf55984"}
Nov 27 11:11:25 crc kubenswrapper[4807]: I1127 11:11:25.211158 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9"
Nov 27 11:11:25 crc kubenswrapper[4807]: I1127 11:11:25.430478 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podStartSLOduration=107.430457673 podStartE2EDuration="1m47.430457673s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:25.246690775 +0000 UTC m=+126.346189003" watchObservedRunningTime="2025-11-27 11:11:25.430457673 +0000 UTC m=+126.529955881"
Nov 27 11:11:25 crc kubenswrapper[4807]: I1127 11:11:25.431213 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wszmz"]
Nov 27 11:11:25 crc kubenswrapper[4807]: I1127 11:11:25.431327 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz"
Nov 27 11:11:25 crc kubenswrapper[4807]: E1127 11:11:25.431426 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10"
Nov 27 11:11:25 crc kubenswrapper[4807]: I1127 11:11:25.532116 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 11:11:25 crc kubenswrapper[4807]: I1127 11:11:25.532177 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 11:11:25 crc kubenswrapper[4807]: E1127 11:11:25.532282 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 11:11:25 crc kubenswrapper[4807]: E1127 11:11:25.532406 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 11:11:25 crc kubenswrapper[4807]: I1127 11:11:25.533433 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 11:11:25 crc kubenswrapper[4807]: E1127 11:11:25.533606 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 11:11:27 crc kubenswrapper[4807]: I1127 11:11:27.531675 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz"
Nov 27 11:11:27 crc kubenswrapper[4807]: E1127 11:11:27.531828 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10"
Nov 27 11:11:27 crc kubenswrapper[4807]: I1127 11:11:27.532354 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 11:11:27 crc kubenswrapper[4807]: I1127 11:11:27.532420 4807 scope.go:117] "RemoveContainer" containerID="a509853063b75406f7fc467d6ab041935f8cee585f17f27c8618916093d4a624"
Nov 27 11:11:27 crc kubenswrapper[4807]: E1127 11:11:27.532463 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 11:11:27 crc kubenswrapper[4807]: I1127 11:11:27.532607 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 11:11:27 crc kubenswrapper[4807]: E1127 11:11:27.532646 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 11:11:27 crc kubenswrapper[4807]: I1127 11:11:27.532675 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 11:11:27 crc kubenswrapper[4807]: E1127 11:11:27.532834 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 11:11:28 crc kubenswrapper[4807]: I1127 11:11:28.225996 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmngf_97f15cbb-220e-47db-b418-3a5aa4eb55a2/kube-multus/1.log"
Nov 27 11:11:28 crc kubenswrapper[4807]: I1127 11:11:28.226085 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmngf" event={"ID":"97f15cbb-220e-47db-b418-3a5aa4eb55a2","Type":"ContainerStarted","Data":"abc43243ac432a6c5ac5ce257d5f7461ab581a61f2fd55bf1613a430d20c13c4"}
Nov 27 11:11:29 crc kubenswrapper[4807]: I1127 11:11:29.531828 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 11:11:29 crc kubenswrapper[4807]: I1127 11:11:29.531835 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 11:11:29 crc kubenswrapper[4807]: I1127 11:11:29.531886 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz"
Nov 27 11:11:29 crc kubenswrapper[4807]: I1127 11:11:29.531897 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 11:11:29 crc kubenswrapper[4807]: E1127 11:11:29.532775 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 27 11:11:29 crc kubenswrapper[4807]: E1127 11:11:29.532825 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 27 11:11:29 crc kubenswrapper[4807]: E1127 11:11:29.532889 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wszmz" podUID="911bce2f-3fb2-484d-870f-d9737047bd10"
Nov 27 11:11:29 crc kubenswrapper[4807]: E1127 11:11:29.533021 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 27 11:11:31 crc kubenswrapper[4807]: I1127 11:11:31.531700 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 27 11:11:31 crc kubenswrapper[4807]: I1127 11:11:31.531824 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz"
Nov 27 11:11:31 crc kubenswrapper[4807]: I1127 11:11:31.532073 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 27 11:11:31 crc kubenswrapper[4807]: I1127 11:11:31.532173 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 27 11:11:31 crc kubenswrapper[4807]: I1127 11:11:31.535046 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Nov 27 11:11:31 crc kubenswrapper[4807]: I1127 11:11:31.535579 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Nov 27 11:11:31 crc kubenswrapper[4807]: I1127 11:11:31.535691 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Nov 27 11:11:31 crc kubenswrapper[4807]: I1127 11:11:31.536304 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Nov 27 11:11:31 crc kubenswrapper[4807]: I1127 11:11:31.536419 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Nov 27 11:11:31 crc kubenswrapper[4807]: I1127 11:11:31.536608 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Nov 27 11:11:34 crc kubenswrapper[4807]: I1127 11:11:34.121012 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9"
Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.529441 4807 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.576004 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4kzvt"]
Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.576528 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt"
Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.578112 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pvv9r"]
Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.578702 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r"
Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.579142 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz"]
Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.579730 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz"
Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.580981 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wk6hj"]
Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.581565 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wk6hj"
Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.584849 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qsdql"]
Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.585371 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql"
Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.590611 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t"]
Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.591196 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t"
Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592071 4807 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592100 4807 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-service-ca": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-service-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object
Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592117 4807 reflector.go:561] object-"openshift-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592135 4807 reflector.go:561] object-"openshift-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592177 4807 reflector.go:561] object-"openshift-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592108 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592178 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592145 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-service-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-service-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592200 4807 reflector.go:158] "Unhandled Error"
err="object-\"openshift-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592279 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592155 4807 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592375 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592565 4807 reflector.go:561] object-"openshift-oauth-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" 
in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592586 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592604 4807 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-login": failed to list *v1.Secret: secrets "v4-0-config-user-template-login" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592630 4807 reflector.go:561] object-"openshift-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592635 4807 reflector.go:561] object-"openshift-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592635 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\": Failed to watch *v1.Secret: failed to list *v1.Secret: 
secrets \"v4-0-config-user-template-login\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592653 4807 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data": failed to list *v1.Secret: secrets "v4-0-config-user-idp-0-file-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592603 4807 reflector.go:561] object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592668 4807 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-provider-selection": failed to list *v1.Secret: secrets "v4-0-config-user-template-provider-selection" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592681 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-idp-0-file-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this 
object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592663 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592685 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592641 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592713 4807 reflector.go:561] object-"openshift-apiserver"/"image-import-ca": failed to list *v1.ConfigMap: configmaps "image-import-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592703 4807 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-authentication\"/\"v4-0-config-user-template-provider-selection\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-provider-selection\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592738 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"image-import-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"image-import-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592664 4807 reflector.go:561] object-"openshift-authentication"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592756 4807 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-config": failed to list *v1.ConfigMap: configmaps "machine-approver-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592771 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User 
\"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592786 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-approver-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592841 4807 reflector.go:561] object-"openshift-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.592865 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.592958 4807 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: 
E1127 11:11:37.592988 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.593573 4807 reflector.go:561] object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc": failed to list *v1.Secret: secrets "oauth-openshift-dockercfg-znhcc" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.593595 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"oauth-openshift-dockercfg-znhcc\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-openshift-dockercfg-znhcc\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.593949 4807 reflector.go:561] object-"openshift-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.593969 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource 
\"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.593986 4807 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.594000 4807 reflector.go:561] object-"openshift-oauth-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.594013 4807 reflector.go:561] object-"openshift-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.594012 4807 reflector.go:561] object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.594025 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: 
configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.594024 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.594012 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.594048 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.594010 4807 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource 
"configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.594090 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.594109 4807 reflector.go:561] object-"openshift-authentication"/"audit": failed to list *v1.ConfigMap: configmaps "audit" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.594135 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"audit\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.594146 4807 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-router-certs": failed to list *v1.Secret: secrets "v4-0-config-system-router-certs" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.594160 4807 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-authentication\"/\"v4-0-config-system-router-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-router-certs\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.594168 4807 reflector.go:561] object-"openshift-cluster-machine-approver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.594217 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.594174 4807 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-cliconfig": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-cliconfig" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.594271 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps 
\"v4-0-config-system-cliconfig\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.594182 4807 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4": failed to list *v1.Secret: secrets "machine-approver-sa-dockercfg-nl2j4" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.594297 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-sa-dockercfg-nl2j4\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-sa-dockercfg-nl2j4\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.595331 4807 reflector.go:561] object-"openshift-machine-api"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.595361 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": 
no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.595422 4807 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-tls": failed to list *v1.Secret: secrets "machine-api-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.595429 4807 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.595450 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.595460 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.595466 4807 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-images": failed 
to list *v1.ConfigMap: configmaps "machine-api-operator-images" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.595490 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-images\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-api-operator-images\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.595987 4807 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.596016 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.596066 4807 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc 
kubenswrapper[4807]: W1127 11:11:37.596068 4807 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.596093 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.596106 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.596741 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8"] Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.596810 4807 reflector.go:561] object-"openshift-oauth-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found 
between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.596832 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.597322 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.598102 4807 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-session": failed to list *v1.Secret: secrets "v4-0-config-system-session" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.598114 4807 reflector.go:561] object-"openshift-oauth-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.598137 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-session\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-session\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and 
this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.598147 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.598673 4807 reflector.go:561] object-"openshift-oauth-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.598708 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.598671 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kp2d4"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.603730 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w4ddg"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.604811 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-jbssf"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.605792 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-jbssf" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.606234 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w4ddg" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.607059 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.610276 4807 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.610339 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.610441 4807 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-serving-cert": failed to list *v1.Secret: secrets "v4-0-config-system-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.610491 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-serving-cert\": Failed to watch *v1.Secret: 
failed to list *v1.Secret: secrets \"v4-0-config-system-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.610571 4807 reflector.go:561] object-"openshift-oauth-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.610590 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.610912 4807 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.610944 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' 
and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.612711 4807 reflector.go:561] object-"openshift-authentication"/"v4-0-config-user-template-error": failed to list *v1.Secret: secrets "v4-0-config-user-template-error" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.612754 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-error\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-user-template-error\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.612824 4807 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "v4-0-config-system-trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.612846 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"v4-0-config-system-trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.612915 4807 reflector.go:561] 
object-"openshift-authentication"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.612946 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.613012 4807 reflector.go:561] object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template": failed to list *v1.Secret: secrets "v4-0-config-system-ocp-branding-template" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.613031 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"v4-0-config-system-ocp-branding-template\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.613166 4807 reflector.go:561] object-"openshift-apiserver"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource 
"configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.613226 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.613102 4807 reflector.go:561] object-"openshift-cluster-machine-approver"/"machine-approver-tls": failed to list *v1.Secret: secrets "machine-approver-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-cluster-machine-approver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.613352 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-cluster-machine-approver\"/\"machine-approver-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-approver-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-cluster-machine-approver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.613372 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.613431 4807 reflector.go:561] object-"openshift-machine-api"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no 
relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.613737 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.613577 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.613578 4807 reflector.go:561] object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx": failed to list *v1.Secret: secrets "cluster-image-registry-operator-dockercfg-m4qtx" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.613896 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"cluster-image-registry-operator-dockercfg-m4qtx\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cluster-image-registry-operator-dockercfg-m4qtx\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.613627 4807 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group 
"" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.613920 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.613645 4807 reflector.go:561] object-"openshift-image-registry"/"trusted-ca": failed to list *v1.ConfigMap: configmaps "trusted-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.613955 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"trusted-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.613687 4807 reflector.go:561] object-"openshift-image-registry"/"image-registry-operator-tls": failed to list *v1.Secret: secrets "image-registry-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.613977 4807 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-image-registry\"/\"image-registry-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"image-registry-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: W1127 11:11:37.613691 4807 reflector.go:561] object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq": failed to list *v1.Secret: secrets "oauth-apiserver-sa-dockercfg-6r2bq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object Nov 27 11:11:37 crc kubenswrapper[4807]: E1127 11:11:37.614001 4807 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-6r2bq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-apiserver-sa-dockercfg-6r2bq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.614507 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.614697 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.614875 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.617452 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.618289 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.623291 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.623947 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-m2dvt"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.624124 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.624370 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.624804 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.625213 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.625613 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.626352 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.626504 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.628387 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.628639 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.628785 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.629380 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.631028 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sdd69"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.636725 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.638566 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.643698 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.645364 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.645709 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.646111 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9jvwr"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.646567 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.646893 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.647188 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.647476 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.647639 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.648513 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.648794 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.649056 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.649546 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.654105 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.654195 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.649717 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.649631 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.654213 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.649787 4807 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.651644 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.649691 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.651753 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.651787 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.651720 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.651821 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.651893 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.655503 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.651856 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.655808 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.651960 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.651992 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.652049 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.652276 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.653497 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.653613 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.653785 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.653878 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.651928 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.659831 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 27 11:11:37 crc 
kubenswrapper[4807]: I1127 11:11:37.660028 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.660168 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.663041 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.664133 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.666886 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.667240 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-jdsqc"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.667474 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.667596 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.698924 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-7kpfm"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.699774 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pvv9r"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.699802 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4kzvt"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.699817 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kp2d4"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.699832 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jbssf"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.699950 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.700068 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-serving-cert\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.700815 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.704988 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhzm2\" (UniqueName: \"kubernetes.io/projected/b61281aa-db40-457c-b669-35a825b77716-kube-api-access-vhzm2\") pod \"dns-operator-744455d44c-w4ddg\" (UID: \"b61281aa-db40-457c-b669-35a825b77716\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4ddg" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705048 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-audit-policies\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705071 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065b7e1c-110e-4b40-bd1f-c507f94828be-config\") pod \"console-operator-58897d9998-kp2d4\" (UID: \"065b7e1c-110e-4b40-bd1f-c507f94828be\") " pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705102 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705135 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705166 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/168d38aa-a194-4c8e-9717-62f2d5fca760-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705190 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-serving-cert\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705223 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pm8v\" (UniqueName: \"kubernetes.io/projected/168d38aa-a194-4c8e-9717-62f2d5fca760-kube-api-access-7pm8v\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705288 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6711e08e-05e5-4111-9e1e-fd5ed988a718-audit-dir\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: 
\"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705320 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705348 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-auth-proxy-config\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705371 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705400 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-client\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705426 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e163b55-77cb-4055-9d4d-bef2d98d0139-node-pullsecrets\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705456 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5t6m\" (UniqueName: \"kubernetes.io/projected/a0a86502-013f-4060-91d7-cf5cd9353ccf-kube-api-access-l5t6m\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705478 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-config\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705535 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tk46\" (UniqueName: \"kubernetes.io/projected/e38d5152-81eb-46c2-9753-84286838528f-kube-api-access-2tk46\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705560 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e163b55-77cb-4055-9d4d-bef2d98d0139-audit-dir\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 
11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705583 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-config\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705609 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705636 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4c84\" (UniqueName: \"kubernetes.io/projected/7462250b-699f-4fff-9600-8dff49efc2e8-kube-api-access-z4c84\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705671 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-config\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705714 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-images\") pod 
\"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705746 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-serving-ca\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705773 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705801 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/065b7e1c-110e-4b40-bd1f-c507f94828be-trusted-ca\") pod \"console-operator-58897d9998-kp2d4\" (UID: \"065b7e1c-110e-4b40-bd1f-c507f94828be\") " pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705825 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e38d5152-81eb-46c2-9753-84286838528f-serving-cert\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705853 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hftrd\" (UniqueName: \"kubernetes.io/projected/6711e08e-05e5-4111-9e1e-fd5ed988a718-kube-api-access-hftrd\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705879 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcfmc\" (UniqueName: \"kubernetes.io/projected/065b7e1c-110e-4b40-bd1f-c507f94828be-kube-api-access-gcfmc\") pod \"console-operator-58897d9998-kp2d4\" (UID: \"065b7e1c-110e-4b40-bd1f-c507f94828be\") " pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705904 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705929 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-encryption-config\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705954 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-audit-policies\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.705992 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zjcm\" (UniqueName: \"kubernetes.io/projected/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-kube-api-access-5zjcm\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706016 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-audit\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706045 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706069 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706095 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706121 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/065b7e1c-110e-4b40-bd1f-c507f94828be-serving-cert\") pod \"console-operator-58897d9998-kp2d4\" (UID: \"065b7e1c-110e-4b40-bd1f-c507f94828be\") " pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706158 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/168d38aa-a194-4c8e-9717-62f2d5fca760-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706184 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/168d38aa-a194-4c8e-9717-62f2d5fca760-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706208 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-client-ca\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: 
\"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706237 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706278 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-image-import-ca\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706304 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706330 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706358 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-config\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706397 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-client\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706420 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-encryption-config\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706444 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b61281aa-db40-457c-b669-35a825b77716-metrics-tls\") pod \"dns-operator-744455d44c-w4ddg\" (UID: \"b61281aa-db40-457c-b669-35a825b77716\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4ddg" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706460 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706482 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a0a86502-013f-4060-91d7-cf5cd9353ccf-machine-approver-tls\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706501 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvv9z\" (UniqueName: \"kubernetes.io/projected/7e163b55-77cb-4055-9d4d-bef2d98d0139-kube-api-access-tvv9z\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706521 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706539 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7462250b-699f-4fff-9600-8dff49efc2e8-audit-dir\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706556 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-x225x\" (UniqueName: \"kubernetes.io/projected/3daa1b2f-7da1-475f-8807-299bcf8423ca-kube-api-access-x225x\") pod \"downloads-7954f5f757-jbssf\" (UID: \"3daa1b2f-7da1-475f-8807-299bcf8423ca\") " pod="openshift-console/downloads-7954f5f757-jbssf" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.706574 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.709211 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lhr72"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.713163 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.713304 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.714157 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.714161 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.715343 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.715343 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lhr72" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.715876 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.717190 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.719976 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ldgcp"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.720665 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.722396 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.723032 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ldgcp" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.739228 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.739664 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-82ntv"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.740197 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.740632 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.741114 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.741806 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.743507 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.743719 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.743934 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.744346 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.744447 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.745343 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.745874 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.746065 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.752352 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.757501 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.758180 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.758745 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.759964 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.760522 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-m2dvt"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.761888 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.763232 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wk6hj"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.764711 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.766874 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.767704 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.768177 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-82ntv"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.769333 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rqbbk"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.770785 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.770895 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.771598 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.773144 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.774534 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w4ddg"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.775841 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.777370 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.777731 4807 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.778786 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sdd69"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.779931 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qsdql"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.781419 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.783163 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jdsqc"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.784409 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-88rr6"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.784972 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-88rr6" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.785640 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.786674 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.787908 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ldgcp"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.789086 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lhr72"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.790089 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.791039 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.792501 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9jvwr"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.794153 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.795125 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.796125 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.797077 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.798164 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rqbbk"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.798777 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.799290 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.801832 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.807354 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-audit-policies\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.807431 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-serving-cert\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.807457 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhzm2\" (UniqueName: 
\"kubernetes.io/projected/b61281aa-db40-457c-b669-35a825b77716-kube-api-access-vhzm2\") pod \"dns-operator-744455d44c-w4ddg\" (UID: \"b61281aa-db40-457c-b669-35a825b77716\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4ddg" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.807512 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065b7e1c-110e-4b40-bd1f-c507f94828be-config\") pod \"console-operator-58897d9998-kp2d4\" (UID: \"065b7e1c-110e-4b40-bd1f-c507f94828be\") " pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.807791 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rqc9w"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808521 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/065b7e1c-110e-4b40-bd1f-c507f94828be-config\") pod \"console-operator-58897d9998-kp2d4\" (UID: \"065b7e1c-110e-4b40-bd1f-c507f94828be\") " pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808562 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808586 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: 
\"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808608 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49510dea-c289-487e-a43e-9c8c314afd82-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-68g9f\" (UID: \"49510dea-c289-487e-a43e-9c8c314afd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808626 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/168d38aa-a194-4c8e-9717-62f2d5fca760-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808643 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-serving-cert\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808662 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pm8v\" (UniqueName: \"kubernetes.io/projected/168d38aa-a194-4c8e-9717-62f2d5fca760-kube-api-access-7pm8v\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808677 4807 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6711e08e-05e5-4111-9e1e-fd5ed988a718-audit-dir\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808692 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/985db6fe-6579-439b-8fc6-b65d265991d4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2fmnh\" (UID: \"985db6fe-6579-439b-8fc6-b65d265991d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808710 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-auth-proxy-config\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808725 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808741 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 
crc kubenswrapper[4807]: I1127 11:11:37.808757 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-client\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808774 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e163b55-77cb-4055-9d4d-bef2d98d0139-node-pullsecrets\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808793 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5t6m\" (UniqueName: \"kubernetes.io/projected/a0a86502-013f-4060-91d7-cf5cd9353ccf-kube-api-access-l5t6m\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808810 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-config\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808811 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6711e08e-05e5-4111-9e1e-fd5ed988a718-audit-dir\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 
27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808837 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-config\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808855 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tk46\" (UniqueName: \"kubernetes.io/projected/e38d5152-81eb-46c2-9753-84286838528f-kube-api-access-2tk46\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808874 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e163b55-77cb-4055-9d4d-bef2d98d0139-audit-dir\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808891 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808919 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/985db6fe-6579-439b-8fc6-b65d265991d4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2fmnh\" (UID: 
\"985db6fe-6579-439b-8fc6-b65d265991d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.808776 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rqc9w" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809192 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58m6x\" (UniqueName: \"kubernetes.io/projected/985db6fe-6579-439b-8fc6-b65d265991d4-kube-api-access-58m6x\") pod \"openshift-apiserver-operator-796bbdcf4f-2fmnh\" (UID: \"985db6fe-6579-439b-8fc6-b65d265991d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809258 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49510dea-c289-487e-a43e-9c8c314afd82-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-68g9f\" (UID: \"49510dea-c289-487e-a43e-9c8c314afd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809288 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-config\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809354 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4c84\" (UniqueName: \"kubernetes.io/projected/7462250b-699f-4fff-9600-8dff49efc2e8-kube-api-access-z4c84\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") 
" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809429 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7e163b55-77cb-4055-9d4d-bef2d98d0139-audit-dir\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809454 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p9f4b"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809505 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7e163b55-77cb-4055-9d4d-bef2d98d0139-node-pullsecrets\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809507 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-images\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809582 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-serving-ca\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809741 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809856 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/065b7e1c-110e-4b40-bd1f-c507f94828be-trusted-ca\") pod \"console-operator-58897d9998-kp2d4\" (UID: \"065b7e1c-110e-4b40-bd1f-c507f94828be\") " pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809878 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e38d5152-81eb-46c2-9753-84286838528f-serving-cert\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809901 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hftrd\" (UniqueName: \"kubernetes.io/projected/6711e08e-05e5-4111-9e1e-fd5ed988a718-kube-api-access-hftrd\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809926 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcfmc\" (UniqueName: \"kubernetes.io/projected/065b7e1c-110e-4b40-bd1f-c507f94828be-kube-api-access-gcfmc\") pod \"console-operator-58897d9998-kp2d4\" (UID: \"065b7e1c-110e-4b40-bd1f-c507f94828be\") " pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809941 4807 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-dns/dns-default-rqc9w"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809948 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809972 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-encryption-config\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.809994 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-audit-policies\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810009 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p9f4b" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810031 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zjcm\" (UniqueName: \"kubernetes.io/projected/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-kube-api-access-5zjcm\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810054 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-audit\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810077 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810099 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810123 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/065b7e1c-110e-4b40-bd1f-c507f94828be-serving-cert\") pod \"console-operator-58897d9998-kp2d4\" (UID: \"065b7e1c-110e-4b40-bd1f-c507f94828be\") " pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810146 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810170 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49510dea-c289-487e-a43e-9c8c314afd82-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-68g9f\" (UID: \"49510dea-c289-487e-a43e-9c8c314afd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810209 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/168d38aa-a194-4c8e-9717-62f2d5fca760-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810263 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/168d38aa-a194-4c8e-9717-62f2d5fca760-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810298 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-client-ca\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810322 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810341 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-image-import-ca\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810362 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810384 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810416 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-config\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810604 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-client\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810637 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-encryption-config\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810678 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b61281aa-db40-457c-b669-35a825b77716-metrics-tls\") pod \"dns-operator-744455d44c-w4ddg\" (UID: \"b61281aa-db40-457c-b669-35a825b77716\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4ddg" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810699 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810799 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a0a86502-013f-4060-91d7-cf5cd9353ccf-machine-approver-tls\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.810904 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvv9z\" (UniqueName: \"kubernetes.io/projected/7e163b55-77cb-4055-9d4d-bef2d98d0139-kube-api-access-tvv9z\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.811020 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.811038 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p9f4b"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.811097 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7462250b-699f-4fff-9600-8dff49efc2e8-audit-dir\") pod 
\"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.811050 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7462250b-699f-4fff-9600-8dff49efc2e8-audit-dir\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.811154 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x225x\" (UniqueName: \"kubernetes.io/projected/3daa1b2f-7da1-475f-8807-299bcf8423ca-kube-api-access-x225x\") pod \"downloads-7954f5f757-jbssf\" (UID: \"3daa1b2f-7da1-475f-8807-299bcf8423ca\") " pod="openshift-console/downloads-7954f5f757-jbssf" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.811178 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.811705 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/065b7e1c-110e-4b40-bd1f-c507f94828be-trusted-ca\") pod \"console-operator-58897d9998-kp2d4\" (UID: \"065b7e1c-110e-4b40-bd1f-c507f94828be\") " pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.816725 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/b61281aa-db40-457c-b669-35a825b77716-metrics-tls\") pod \"dns-operator-744455d44c-w4ddg\" (UID: \"b61281aa-db40-457c-b669-35a825b77716\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4ddg" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.816776 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/065b7e1c-110e-4b40-bd1f-c507f94828be-serving-cert\") pod \"console-operator-58897d9998-kp2d4\" (UID: \"065b7e1c-110e-4b40-bd1f-c507f94828be\") " pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.818049 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.830811 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jvchs"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.832084 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.838635 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.839106 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jvchs"] Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.858565 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.878202 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.898763 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.912048 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49510dea-c289-487e-a43e-9c8c314afd82-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-68g9f\" (UID: \"49510dea-c289-487e-a43e-9c8c314afd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.912091 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/985db6fe-6579-439b-8fc6-b65d265991d4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2fmnh\" (UID: \"985db6fe-6579-439b-8fc6-b65d265991d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.912172 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/985db6fe-6579-439b-8fc6-b65d265991d4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2fmnh\" (UID: \"985db6fe-6579-439b-8fc6-b65d265991d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.912189 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58m6x\" (UniqueName: \"kubernetes.io/projected/985db6fe-6579-439b-8fc6-b65d265991d4-kube-api-access-58m6x\") pod \"openshift-apiserver-operator-796bbdcf4f-2fmnh\" (UID: \"985db6fe-6579-439b-8fc6-b65d265991d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.912208 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49510dea-c289-487e-a43e-9c8c314afd82-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-68g9f\" (UID: \"49510dea-c289-487e-a43e-9c8c314afd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.912354 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49510dea-c289-487e-a43e-9c8c314afd82-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-68g9f\" (UID: \"49510dea-c289-487e-a43e-9c8c314afd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.913035 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49510dea-c289-487e-a43e-9c8c314afd82-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-68g9f\" (UID: \"49510dea-c289-487e-a43e-9c8c314afd82\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.913087 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/985db6fe-6579-439b-8fc6-b65d265991d4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-2fmnh\" (UID: \"985db6fe-6579-439b-8fc6-b65d265991d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.915553 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49510dea-c289-487e-a43e-9c8c314afd82-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-68g9f\" (UID: \"49510dea-c289-487e-a43e-9c8c314afd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.915878 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/985db6fe-6579-439b-8fc6-b65d265991d4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-2fmnh\" (UID: \"985db6fe-6579-439b-8fc6-b65d265991d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.918525 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.938296 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.958574 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.978444 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 27 11:11:37 crc kubenswrapper[4807]: I1127 11:11:37.998614 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.038078 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.059196 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.079149 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.100040 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.118946 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.139653 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.159991 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.179394 4807 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.205096 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.219123 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.239106 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.265232 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.278943 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.299505 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.318018 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.338585 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.358897 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.379943 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.399914 4807 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.419514 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.439379 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.466161 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.479120 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.498180 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.519044 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.538995 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.559524 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.578777 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.598943 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.618569 4807 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.639176 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.658734 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.679014 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.699095 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.718945 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.736794 4807 request.go:700] Waited for 1.013035848s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.738143 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.778702 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.798958 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.807990 4807 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-router-certs: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.808037 4807 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.808057 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-router-certs podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.308035216 +0000 UTC m=+140.407533414 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-router-certs" (UniqueName: "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-router-certs") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.808093 4807 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.808105 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-serving-cert podName:6711e08e-05e5-4111-9e1e-fd5ed988a718 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.308082677 +0000 UTC m=+140.407580955 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-serving-cert") pod "apiserver-7bbb656c7d-hkmnz" (UID: "6711e08e-05e5-4111-9e1e-fd5ed988a718") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.808138 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-audit-policies podName:6711e08e-05e5-4111-9e1e-fd5ed988a718 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.308123258 +0000 UTC m=+140.407621556 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-audit-policies") pod "apiserver-7bbb656c7d-hkmnz" (UID: "6711e08e-05e5-4111-9e1e-fd5ed988a718") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809291 4807 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809358 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-config podName:a0a86502-013f-4060-91d7-cf5cd9353ccf nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.309320674 +0000 UTC m=+140.408818952 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-config") pod "machine-approver-56656f9798-lqp4t" (UID: "a0a86502-013f-4060-91d7-cf5cd9353ccf") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809367 4807 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809462 4807 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809399 4807 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809400 4807 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809540 4807 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809399 4807 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-ocp-branding-template: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809465 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-config podName:e38d5152-81eb-46c2-9753-84286838528f nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.309446228 +0000 UTC m=+140.408944456 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-config") pod "controller-manager-879f6c89f-4kzvt" (UID: "e38d5152-81eb-46c2-9753-84286838528f") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809587 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-config podName:d0fee666-2d95-4330-a8aa-4ab1ca30bb5f nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.309573621 +0000 UTC m=+140.409071909 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-config") pod "machine-api-operator-5694c8668f-pvv9r" (UID: "d0fee666-2d95-4330-a8aa-4ab1ca30bb5f") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809603 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-trusted-ca-bundle podName:6711e08e-05e5-4111-9e1e-fd5ed988a718 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.309595332 +0000 UTC m=+140.409093640 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-trusted-ca-bundle") pod "apiserver-7bbb656c7d-hkmnz" (UID: "6711e08e-05e5-4111-9e1e-fd5ed988a718") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809608 4807 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809616 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-auth-proxy-config podName:a0a86502-013f-4060-91d7-cf5cd9353ccf nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.309610133 +0000 UTC m=+140.409108431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-auth-proxy-config") pod "machine-approver-56656f9798-lqp4t" (UID: "a0a86502-013f-4060-91d7-cf5cd9353ccf") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809630 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-client podName:7e163b55-77cb-4055-9d4d-bef2d98d0139 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.309623793 +0000 UTC m=+140.409122101 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-client") pod "apiserver-76f77b778f-wk6hj" (UID: "7e163b55-77cb-4055-9d4d-bef2d98d0139") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809654 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-ocp-branding-template podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.309646604 +0000 UTC m=+140.409144892 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-ocp-branding-template" (UniqueName: "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-ocp-branding-template") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809670 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-serving-cert podName:7e163b55-77cb-4055-9d4d-bef2d98d0139 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.309663754 +0000 UTC m=+140.409162052 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-serving-cert") pod "apiserver-76f77b778f-wk6hj" (UID: "7e163b55-77cb-4055-9d4d-bef2d98d0139") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809664 4807 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809720 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-images podName:d0fee666-2d95-4330-a8aa-4ab1ca30bb5f nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.309711456 +0000 UTC m=+140.409209644 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-images") pod "machine-api-operator-5694c8668f-pvv9r" (UID: "d0fee666-2d95-4330-a8aa-4ab1ca30bb5f") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809745 4807 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809789 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-serving-ca podName:7e163b55-77cb-4055-9d4d-bef2d98d0139 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.309781838 +0000 UTC m=+140.409280036 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-serving-ca") pod "apiserver-76f77b778f-wk6hj" (UID: "7e163b55-77cb-4055-9d4d-bef2d98d0139") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809814 4807 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-service-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809873 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-service-ca podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.3098646 +0000 UTC m=+140.409362918 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-service-ca" (UniqueName: "kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-service-ca") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809905 4807 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809935 4807 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809964 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-trusted-ca-bundle podName:7e163b55-77cb-4055-9d4d-bef2d98d0139 nodeName:}" failed. 
No retries permitted until 2025-11-27 11:11:39.309952773 +0000 UTC m=+140.409451091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-trusted-ca-bundle") pod "apiserver-76f77b778f-wk6hj" (UID: "7e163b55-77cb-4055-9d4d-bef2d98d0139") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.809981 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-serving-ca podName:6711e08e-05e5-4111-9e1e-fd5ed988a718 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.309973813 +0000 UTC m=+140.409472121 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-serving-ca") pod "apiserver-7bbb656c7d-hkmnz" (UID: "6711e08e-05e5-4111-9e1e-fd5ed988a718") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810266 4807 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810287 4807 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810330 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-serving-cert podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310314174 +0000 UTC m=+140.409812402 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-serving-cert" (UniqueName: "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-serving-cert") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810334 4807 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810359 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-encryption-config podName:7e163b55-77cb-4055-9d4d-bef2d98d0139 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310343594 +0000 UTC m=+140.409841832 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-encryption-config") pod "apiserver-76f77b778f-wk6hj" (UID: "7e163b55-77cb-4055-9d4d-bef2d98d0139") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810375 4807 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810383 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e38d5152-81eb-46c2-9753-84286838528f-serving-cert podName:e38d5152-81eb-46c2-9753-84286838528f nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310372135 +0000 UTC m=+140.409870373 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e38d5152-81eb-46c2-9753-84286838528f-serving-cert") pod "controller-manager-879f6c89f-4kzvt" (UID: "e38d5152-81eb-46c2-9753-84286838528f") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810407 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-proxy-ca-bundles podName:e38d5152-81eb-46c2-9753-84286838528f nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310396416 +0000 UTC m=+140.409894704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-proxy-ca-bundles") pod "controller-manager-879f6c89f-4kzvt" (UID: "e38d5152-81eb-46c2-9753-84286838528f") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810428 4807 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-idp-0-file-data: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810443 4807 configmap.go:193] Couldn't get configMap openshift-authentication/audit: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810459 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-idp-0-file-data podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310451388 +0000 UTC m=+140.409949716 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-user-idp-0-file-data" (UniqueName: "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-idp-0-file-data") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810481 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-audit-policies podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310470348 +0000 UTC m=+140.409968576 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-audit-policies") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810499 4807 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810516 4807 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810527 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-cliconfig podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.31051848 +0000 UTC m=+140.410016798 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-cliconfig") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810553 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-client-ca podName:e38d5152-81eb-46c2-9753-84286838528f nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.31054257 +0000 UTC m=+140.410040798 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-client-ca") pod "controller-manager-879f6c89f-4kzvt" (UID: "e38d5152-81eb-46c2-9753-84286838528f") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810555 4807 configmap.go:193] Couldn't get configMap openshift-image-registry/trusted-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810581 4807 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810590 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/168d38aa-a194-4c8e-9717-62f2d5fca760-trusted-ca podName:168d38aa-a194-4c8e-9717-62f2d5fca760 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310581121 +0000 UTC m=+140.410079449 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/168d38aa-a194-4c8e-9717-62f2d5fca760-trusted-ca") pod "cluster-image-registry-operator-dc59b4c8b-zjcm8" (UID: "168d38aa-a194-4c8e-9717-62f2d5fca760") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810615 4807 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-session: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810618 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-machine-api-operator-tls podName:d0fee666-2d95-4330-a8aa-4ab1ca30bb5f nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310607002 +0000 UTC m=+140.410105240 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-pvv9r" (UID: "d0fee666-2d95-4330-a8aa-4ab1ca30bb5f") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810655 4807 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810660 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-session podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310652024 +0000 UTC m=+140.410150322 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-session") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810689 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-config podName:7e163b55-77cb-4055-9d4d-bef2d98d0139 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310678774 +0000 UTC m=+140.410177012 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-config") pod "apiserver-76f77b778f-wk6hj" (UID: "7e163b55-77cb-4055-9d4d-bef2d98d0139") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810690 4807 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810718 4807 secret.go:188] Couldn't get secret openshift-image-registry/image-registry-operator-tls: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810723 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-audit podName:7e163b55-77cb-4055-9d4d-bef2d98d0139 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310715365 +0000 UTC m=+140.410213693 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-audit") pod "apiserver-76f77b778f-wk6hj" (UID: "7e163b55-77cb-4055-9d4d-bef2d98d0139") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810768 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/168d38aa-a194-4c8e-9717-62f2d5fca760-image-registry-operator-tls podName:168d38aa-a194-4c8e-9717-62f2d5fca760 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310757537 +0000 UTC m=+140.410255775 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/168d38aa-a194-4c8e-9717-62f2d5fca760-image-registry-operator-tls") pod "cluster-image-registry-operator-dc59b4c8b-zjcm8" (UID: "168d38aa-a194-4c8e-9717-62f2d5fca760") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810804 4807 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810880 4807 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810860 4807 secret.go:188] Couldn't get secret openshift-oauth-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810924 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-client podName:6711e08e-05e5-4111-9e1e-fd5ed988a718 nodeName:}" failed. 
No retries permitted until 2025-11-27 11:11:39.310911741 +0000 UTC m=+140.410409969 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-client") pod "apiserver-7bbb656c7d-hkmnz" (UID: "6711e08e-05e5-4111-9e1e-fd5ed988a718") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810954 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-trusted-ca-bundle podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310941782 +0000 UTC m=+140.410440010 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-trusted-ca-bundle") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810961 4807 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-login: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810976 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-encryption-config podName:6711e08e-05e5-4111-9e1e-fd5ed988a718 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310966063 +0000 UTC m=+140.410464301 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-encryption-config") pod "apiserver-7bbb656c7d-hkmnz" (UID: "6711e08e-05e5-4111-9e1e-fd5ed988a718") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.810982 4807 secret.go:188] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.811000 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-login podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.310989154 +0000 UTC m=+140.410487472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-login" (UniqueName: "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-login") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.811053 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0a86502-013f-4060-91d7-cf5cd9353ccf-machine-approver-tls podName:a0a86502-013f-4060-91d7-cf5cd9353ccf nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.311040605 +0000 UTC m=+140.410538903 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/a0a86502-013f-4060-91d7-cf5cd9353ccf-machine-approver-tls") pod "machine-approver-56656f9798-lqp4t" (UID: "a0a86502-013f-4060-91d7-cf5cd9353ccf") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.811586 4807 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.811630 4807 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-provider-selection: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.811639 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-image-import-ca podName:7e163b55-77cb-4055-9d4d-bef2d98d0139 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.311622802 +0000 UTC m=+140.411121040 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-image-import-ca") pod "apiserver-76f77b778f-wk6hj" (UID: "7e163b55-77cb-4055-9d4d-bef2d98d0139") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.811653 4807 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-error: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.811695 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-provider-selection podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. 
No retries permitted until 2025-11-27 11:11:39.311660734 +0000 UTC m=+140.411159052 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-provider-selection" (UniqueName: "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-provider-selection") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: E1127 11:11:38.811712 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-error podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:39.311705695 +0000 UTC m=+140.411203893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-error" (UniqueName: "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-error") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.818012 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.837998 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.858970 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.877879 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 27 11:11:38 crc 
kubenswrapper[4807]: I1127 11:11:38.897991 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.918358 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.946993 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.959164 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.979181 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 27 11:11:38 crc kubenswrapper[4807]: I1127 11:11:38.999351 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.019151 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.038904 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.059583 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.079605 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 27 11:11:39 crc 
kubenswrapper[4807]: I1127 11:11:39.099304 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.118654 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.138786 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.158546 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.178347 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.199311 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.219271 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.238843 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.259653 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.278186 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.299024 4807 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.319059 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.334028 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.334131 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-config\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.334187 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-client\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.334223 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-encryption-config\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.334309 4807 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.334349 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a0a86502-013f-4060-91d7-cf5cd9353ccf-machine-approver-tls\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.334399 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.334451 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.334491 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-audit-policies\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.334526 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-serving-cert\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.334560 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.334632 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.334861 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-serving-cert\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.335034 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-auth-proxy-config\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.335105 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.335219 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.335338 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-client\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.335483 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-config\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.335598 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-config\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.335752 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.335912 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-config\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.335956 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-images\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.336025 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-serving-ca\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.336155 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.336217 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e38d5152-81eb-46c2-9753-84286838528f-serving-cert\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.336316 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.336389 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-encryption-config\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.336453 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-audit-policies\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc 
kubenswrapper[4807]: I1127 11:11:39.336643 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-audit\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.336774 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.336825 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.336862 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.336896 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/168d38aa-a194-4c8e-9717-62f2d5fca760-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.336931 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/168d38aa-a194-4c8e-9717-62f2d5fca760-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.336967 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-client-ca\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.337081 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.337147 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-image-import-ca\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.337185 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.338463 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.358416 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.379342 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.398958 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.419350 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.465980 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhzm2\" (UniqueName: \"kubernetes.io/projected/b61281aa-db40-457c-b669-35a825b77716-kube-api-access-vhzm2\") pod \"dns-operator-744455d44c-w4ddg\" (UID: \"b61281aa-db40-457c-b669-35a825b77716\") " pod="openshift-dns-operator/dns-operator-744455d44c-w4ddg" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.479441 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.483750 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7pm8v\" (UniqueName: \"kubernetes.io/projected/168d38aa-a194-4c8e-9717-62f2d5fca760-kube-api-access-7pm8v\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.539204 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.554592 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-w4ddg" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.558713 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.604271 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/168d38aa-a194-4c8e-9717-62f2d5fca760-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.654011 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcfmc\" (UniqueName: \"kubernetes.io/projected/065b7e1c-110e-4b40-bd1f-c507f94828be-kube-api-access-gcfmc\") pod \"console-operator-58897d9998-kp2d4\" (UID: \"065b7e1c-110e-4b40-bd1f-c507f94828be\") " pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.659014 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.679359 4807 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.700361 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.718806 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.736972 4807 request.go:700] Waited for 1.926569934s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/serviceaccounts/machine-api-operator/token Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.799373 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.815382 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x225x\" (UniqueName: \"kubernetes.io/projected/3daa1b2f-7da1-475f-8807-299bcf8423ca-kube-api-access-x225x\") pod \"downloads-7954f5f757-jbssf\" (UID: \"3daa1b2f-7da1-475f-8807-299bcf8423ca\") " pod="openshift-console/downloads-7954f5f757-jbssf" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.822224 4807 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.838687 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.849091 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jbssf" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.863534 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.870755 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-w4ddg"] Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.873914 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49510dea-c289-487e-a43e-9c8c314afd82-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-68g9f\" (UID: \"49510dea-c289-487e-a43e-9c8c314afd82\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" Nov 27 11:11:39 crc kubenswrapper[4807]: W1127 11:11:39.884589 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb61281aa_db40_457c_b669_35a825b77716.slice/crio-ee4581c483f68fee740ab0270067f49b44cd01cc0ef668f52250e77f36f20dd0 WatchSource:0}: Error finding container ee4581c483f68fee740ab0270067f49b44cd01cc0ef668f52250e77f36f20dd0: Status 404 returned error can't find the container with id ee4581c483f68fee740ab0270067f49b44cd01cc0ef668f52250e77f36f20dd0 Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.893908 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58m6x\" (UniqueName: \"kubernetes.io/projected/985db6fe-6579-439b-8fc6-b65d265991d4-kube-api-access-58m6x\") pod \"openshift-apiserver-operator-796bbdcf4f-2fmnh\" (UID: \"985db6fe-6579-439b-8fc6-b65d265991d4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.907282 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.924431 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.928609 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.939087 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.945471 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.947788 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.964201 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.979135 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 27 11:11:39 crc kubenswrapper[4807]: I1127 11:11:39.999681 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.011723 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-audit\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.016693 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jbssf"] Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.041775 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.041897 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 27 
11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.047554 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.053577 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a0a86502-013f-4060-91d7-cf5cd9353ccf-machine-approver-tls\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.056920 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kp2d4"] Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.059463 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.081660 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.090113 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e38d5152-81eb-46c2-9753-84286838528f-serving-cert\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.091358 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f"] Nov 27 11:11:40 crc 
kubenswrapper[4807]: I1127 11:11:40.099147 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.108919 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-client\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:40 crc kubenswrapper[4807]: W1127 11:11:40.112177 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49510dea_c289_487e_a43e_9c8c314afd82.slice/crio-c401f455b7dec93477e475ad0b9cffe329c59d4e01874a3b152053f58b170d2a WatchSource:0}: Error finding container c401f455b7dec93477e475ad0b9cffe329c59d4e01874a3b152053f58b170d2a: Status 404 returned error can't find the container with id c401f455b7dec93477e475ad0b9cffe329c59d4e01874a3b152053f58b170d2a Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.120819 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.130092 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.135509 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh"] Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.138189 4807 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 27 11:11:40 crc kubenswrapper[4807]: W1127 11:11:40.142371 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod985db6fe_6579_439b_8fc6_b65d265991d4.slice/crio-89aa1831361a96ac489b5f537a1937cf48fba2926ee2308cc749b17b47ead980 WatchSource:0}: Error finding container 89aa1831361a96ac489b5f537a1937cf48fba2926ee2308cc749b17b47ead980: Status 404 returned error can't find the container with id 89aa1831361a96ac489b5f537a1937cf48fba2926ee2308cc749b17b47ead980 Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.151158 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/168d38aa-a194-4c8e-9717-62f2d5fca760-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.158495 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.165027 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-config\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.179281 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.187405 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-config\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.198828 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.206162 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-config\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.218209 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.238055 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.251132 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.258863 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.264383 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvv9z\" (UniqueName: 
\"kubernetes.io/projected/7e163b55-77cb-4055-9d4d-bef2d98d0139-kube-api-access-tvv9z\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.265375 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jbssf" event={"ID":"3daa1b2f-7da1-475f-8807-299bcf8423ca","Type":"ContainerStarted","Data":"01c34a36ed332943ba23ef3ba846111f07917e1f2a45dad6aed02c1f7b9ba811"} Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.265414 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jbssf" event={"ID":"3daa1b2f-7da1-475f-8807-299bcf8423ca","Type":"ContainerStarted","Data":"a2859d5f3a326dacb7580b2edd601f5802723c39a22a306437383a8fd840b71a"} Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.265576 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-jbssf" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.267269 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" event={"ID":"985db6fe-6579-439b-8fc6-b65d265991d4","Type":"ContainerStarted","Data":"94f4a87ce70a26afc96764a126703c6e75fc78361b971244d147fef479803b36"} Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.267298 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" event={"ID":"985db6fe-6579-439b-8fc6-b65d265991d4","Type":"ContainerStarted","Data":"89aa1831361a96ac489b5f537a1937cf48fba2926ee2308cc749b17b47ead980"} Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.268132 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" 
event={"ID":"49510dea-c289-487e-a43e-9c8c314afd82","Type":"ContainerStarted","Data":"c401f455b7dec93477e475ad0b9cffe329c59d4e01874a3b152053f58b170d2a"} Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.269575 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kp2d4" event={"ID":"065b7e1c-110e-4b40-bd1f-c507f94828be","Type":"ContainerStarted","Data":"57e3ea4090ec4d0b76096762000418b2b287e0c2eda9b573d09d9f0301401787"} Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.269599 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kp2d4" event={"ID":"065b7e1c-110e-4b40-bd1f-c507f94828be","Type":"ContainerStarted","Data":"f06c4cd69c16a510bf3ddc4bb8bb131caac850f98999c6375d853292af63f6cf"} Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.270139 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.270212 4807 patch_prober.go:28] interesting pod/downloads-7954f5f757-jbssf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.270239 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jbssf" podUID="3daa1b2f-7da1-475f-8807-299bcf8423ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.273299 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w4ddg" 
event={"ID":"b61281aa-db40-457c-b669-35a825b77716","Type":"ContainerStarted","Data":"d1461dbd2fbed5800eb05779c366a6c5501523ca03020def936c04575bac821f"} Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.273324 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w4ddg" event={"ID":"b61281aa-db40-457c-b669-35a825b77716","Type":"ContainerStarted","Data":"ee4581c483f68fee740ab0270067f49b44cd01cc0ef668f52250e77f36f20dd0"} Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.273797 4807 patch_prober.go:28] interesting pod/console-operator-58897d9998-kp2d4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.273823 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kp2d4" podUID="065b7e1c-110e-4b40-bd1f-c507f94828be" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.278441 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.299354 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.318110 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.328765 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.334447 4807 secret.go:188] Couldn't get secret openshift-oauth-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.334534 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-encryption-config podName:6711e08e-05e5-4111-9e1e-fd5ed988a718 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.334510328 +0000 UTC m=+142.434008536 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-encryption-config") pod "apiserver-7bbb656c7d-hkmnz" (UID: "6711e08e-05e5-4111-9e1e-fd5ed988a718") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.334664 4807 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-template-error: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.334707 4807 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.334718 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-error podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. 
No retries permitted until 2025-11-27 11:11:41.334702943 +0000 UTC m=+142.434201261 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-error" (UniqueName: "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-error") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.334783 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-serving-cert podName:6711e08e-05e5-4111-9e1e-fd5ed988a718 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.334764695 +0000 UTC m=+142.434262993 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-serving-cert") pod "apiserver-7bbb656c7d-hkmnz" (UID: "6711e08e-05e5-4111-9e1e-fd5ed988a718") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.334865 4807 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-ocp-branding-template: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.334923 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-ocp-branding-template podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.334905769 +0000 UTC m=+142.434404047 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-ocp-branding-template" (UniqueName: "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-ocp-branding-template") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.334955 4807 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.334993 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-audit-policies podName:6711e08e-05e5-4111-9e1e-fd5ed988a718 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.334983652 +0000 UTC m=+142.434481970 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-audit-policies") pod "apiserver-7bbb656c7d-hkmnz" (UID: "6711e08e-05e5-4111-9e1e-fd5ed988a718") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.335018 4807 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.335044 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-serving-cert podName:7e163b55-77cb-4055-9d4d-bef2d98d0139 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.335036503 +0000 UTC m=+142.434534821 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-serving-cert") pod "apiserver-76f77b778f-wk6hj" (UID: "7e163b55-77cb-4055-9d4d-bef2d98d0139") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.335242 4807 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.335308 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-auth-proxy-config podName:a0a86502-013f-4060-91d7-cf5cd9353ccf nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.335295981 +0000 UTC m=+142.434794189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-auth-proxy-config") pod "machine-approver-56656f9798-lqp4t" (UID: "a0a86502-013f-4060-91d7-cf5cd9353ccf") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.335385 4807 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-service-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.335438 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-service-ca podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.335427395 +0000 UTC m=+142.434925683 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-service-ca" (UniqueName: "kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-service-ca") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.335530 4807 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.335574 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-client podName:7e163b55-77cb-4055-9d4d-bef2d98d0139 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.335563029 +0000 UTC m=+142.435061337 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-client") pod "apiserver-76f77b778f-wk6hj" (UID: "7e163b55-77cb-4055-9d4d-bef2d98d0139") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336066 4807 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336092 4807 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336118 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-trusted-ca-bundle podName:6711e08e-05e5-4111-9e1e-fd5ed988a718 nodeName:}" failed. 
No retries permitted until 2025-11-27 11:11:41.336108445 +0000 UTC m=+142.435606653 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-trusted-ca-bundle") pod "apiserver-7bbb656c7d-hkmnz" (UID: "6711e08e-05e5-4111-9e1e-fd5ed988a718") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336138 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-config podName:d0fee666-2d95-4330-a8aa-4ab1ca30bb5f nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.336129996 +0000 UTC m=+142.435628214 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-config") pod "machine-api-operator-5694c8668f-pvv9r" (UID: "d0fee666-2d95-4330-a8aa-4ab1ca30bb5f") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336158 4807 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336195 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-images podName:d0fee666-2d95-4330-a8aa-4ab1ca30bb5f nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.336184127 +0000 UTC m=+142.435682415 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-images") pod "machine-api-operator-5694c8668f-pvv9r" (UID: "d0fee666-2d95-4330-a8aa-4ab1ca30bb5f") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336196 4807 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336269 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-serving-ca podName:7e163b55-77cb-4055-9d4d-bef2d98d0139 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.336235149 +0000 UTC m=+142.435733447 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-serving-ca") pod "apiserver-76f77b778f-wk6hj" (UID: "7e163b55-77cb-4055-9d4d-bef2d98d0139") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336279 4807 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336320 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-serving-ca podName:6711e08e-05e5-4111-9e1e-fd5ed988a718 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.336309501 +0000 UTC m=+142.435807799 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-serving-ca") pod "apiserver-7bbb656c7d-hkmnz" (UID: "6711e08e-05e5-4111-9e1e-fd5ed988a718") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336478 4807 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336518 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-proxy-ca-bundles podName:e38d5152-81eb-46c2-9753-84286838528f nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.336509827 +0000 UTC m=+142.436008155 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-proxy-ca-bundles") pod "controller-manager-879f6c89f-4kzvt" (UID: "e38d5152-81eb-46c2-9753-84286838528f") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336594 4807 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336640 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-encryption-config podName:7e163b55-77cb-4055-9d4d-bef2d98d0139 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.336629721 +0000 UTC m=+142.436128049 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-encryption-config") pod "apiserver-76f77b778f-wk6hj" (UID: "7e163b55-77cb-4055-9d4d-bef2d98d0139") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336666 4807 configmap.go:193] Couldn't get configMap openshift-authentication/audit: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336699 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-audit-policies podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.336691403 +0000 UTC m=+142.436189621 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-audit-policies") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.336968 4807 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-system-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.337000 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-serving-cert podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.336991791 +0000 UTC m=+142.436489999 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-serving-cert" (UniqueName: "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-serving-cert") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.337035 4807 configmap.go:193] Couldn't get configMap openshift-image-registry/trusted-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.337056 4807 secret.go:188] Couldn't get secret openshift-authentication/v4-0-config-user-idp-0-file-data: failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.337062 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/168d38aa-a194-4c8e-9717-62f2d5fca760-trusted-ca podName:168d38aa-a194-4c8e-9717-62f2d5fca760 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.337053853 +0000 UTC m=+142.436552061 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/168d38aa-a194-4c8e-9717-62f2d5fca760-trusted-ca") pod "cluster-image-registry-operator-dc59b4c8b-zjcm8" (UID: "168d38aa-a194-4c8e-9717-62f2d5fca760") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.337101 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-idp-0-file-data podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.337093375 +0000 UTC m=+142.436591573 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-user-idp-0-file-data" (UniqueName: "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-idp-0-file-data") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync secret cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.337154 4807 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.337160 4807 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.337201 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-client-ca podName:e38d5152-81eb-46c2-9753-84286838528f nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.337193497 +0000 UTC m=+142.436691705 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-client-ca") pod "controller-manager-879f6c89f-4kzvt" (UID: "e38d5152-81eb-46c2-9753-84286838528f") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.337224 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-cliconfig podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.337216748 +0000 UTC m=+142.436714966 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-cliconfig") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.337286 4807 configmap.go:193] Couldn't get configMap openshift-apiserver/image-import-ca: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.337300 4807 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.337314 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-image-import-ca podName:7e163b55-77cb-4055-9d4d-bef2d98d0139 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.337305791 +0000 UTC m=+142.436803999 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-import-ca" (UniqueName: "kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-image-import-ca") pod "apiserver-76f77b778f-wk6hj" (UID: "7e163b55-77cb-4055-9d4d-bef2d98d0139") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.337335 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-trusted-ca-bundle podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.337326121 +0000 UTC m=+142.436824319 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-trusted-ca-bundle") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.337984 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.378330 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.398413 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.418393 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.432012 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zjcm\" (UniqueName: \"kubernetes.io/projected/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-kube-api-access-5zjcm\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.443227 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452263 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-service-ca\") pod 
\"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452311 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85f17fc8-a7a8-4fdf-88d2-7f9471b61c79-trusted-ca\") pod \"ingress-operator-5b745b69d9-vmm8j\" (UID: \"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452329 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8c1dab92-eafe-4867-a625-c1e7404c7cf0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vnpgt\" (UID: \"8c1dab92-eafe-4867-a625-c1e7404c7cf0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452345 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwq8n\" (UniqueName: \"kubernetes.io/projected/b8224b35-5290-4943-987c-3101d802b811-kube-api-access-gwq8n\") pod \"machine-config-controller-84d6567774-ddd75\" (UID: \"b8224b35-5290-4943-987c-3101d802b811\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452448 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-serving-cert\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452485 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-service-ca-bundle\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452507 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnh2m\" (UniqueName: \"kubernetes.io/projected/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-kube-api-access-bnh2m\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452527 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0a7ac40a-0ecf-482a-9353-e6c71787da7e-stats-auth\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452550 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-config\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452604 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035df5f8-99c1-4c5c-bb47-87bd5057313e-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-gffzl\" (UID: \"035df5f8-99c1-4c5c-bb47-87bd5057313e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452666 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-bound-sa-token\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452696 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9pwb\" (UID: \"8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452720 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-serving-cert\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452742 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4edf81da-33cf-4824-9bac-4926f59ccea1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-n7cfz\" (UID: \"4edf81da-33cf-4824-9bac-4926f59ccea1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" Nov 27 11:11:40 crc 
kubenswrapper[4807]: I1127 11:11:40.452756 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-trusted-ca-bundle\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.452785 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35c0378e-2da0-4e94-8230-2db66a4c7993-trusted-ca\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453009 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a14c13d8-b1d3-40c3-84c5-13f3e2060948-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gh6dt\" (UID: \"a14c13d8-b1d3-40c3-84c5-13f3e2060948\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453167 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdql\" (UniqueName: \"kubernetes.io/projected/8a49f46d-d1b2-4a71-b51f-be562df1fb70-kube-api-access-2jdql\") pod \"catalog-operator-68c6474976-95lnp\" (UID: \"8a49f46d-d1b2-4a71-b51f-be562df1fb70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453196 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/648ab531-aea8-438e-9a6a-f827594e98b4-client-ca\") pod \"route-controller-manager-6576b87f9c-sxp7z\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453213 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4edf81da-33cf-4824-9bac-4926f59ccea1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-n7cfz\" (UID: \"4edf81da-33cf-4824-9bac-4926f59ccea1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453282 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl6sp\" (UniqueName: \"kubernetes.io/projected/93c49e07-08ef-4b31-abb3-787a46a3fbfd-kube-api-access-gl6sp\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453314 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b8224b35-5290-4943-987c-3101d802b811-proxy-tls\") pod \"machine-config-controller-84d6567774-ddd75\" (UID: \"b8224b35-5290-4943-987c-3101d802b811\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453352 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a7ac40a-0ecf-482a-9353-e6c71787da7e-metrics-certs\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 
11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453394 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/30c4595c-6d61-4e1a-a92a-db32926bae0b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lhr72\" (UID: \"30c4595c-6d61-4e1a-a92a-db32926bae0b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lhr72" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453412 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edf81da-33cf-4824-9bac-4926f59ccea1-config\") pod \"kube-apiserver-operator-766d6c64bb-n7cfz\" (UID: \"4edf81da-33cf-4824-9bac-4926f59ccea1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453437 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-config\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453455 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0a7ac40a-0ecf-482a-9353-e6c71787da7e-default-certificate\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453473 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-registry-tls\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453492 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/648ab531-aea8-438e-9a6a-f827594e98b4-config\") pod \"route-controller-manager-6576b87f9c-sxp7z\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453514 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24lhb\" (UniqueName: \"kubernetes.io/projected/8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14-kube-api-access-24lhb\") pod \"openshift-config-operator-7777fb866f-p9pwb\" (UID: \"8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453543 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-serving-cert\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453557 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-oauth-config\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc 
kubenswrapper[4807]: I1127 11:11:40.453574 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9trp\" (UniqueName: \"kubernetes.io/projected/edf54281-ae68-49ee-ad29-b47f066f43df-kube-api-access-w9trp\") pod \"migrator-59844c95c7-ldgcp\" (UID: \"edf54281-ae68-49ee-ad29-b47f066f43df\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ldgcp" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453599 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a7ac40a-0ecf-482a-9353-e6c71787da7e-service-ca-bundle\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453671 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c4r6\" (UniqueName: \"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-kube-api-access-2c4r6\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453689 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnw5w\" (UniqueName: \"kubernetes.io/projected/85f17fc8-a7a8-4fdf-88d2-7f9471b61c79-kube-api-access-vnw5w\") pod \"ingress-operator-5b745b69d9-vmm8j\" (UID: \"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453707 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-etcd-client\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.453748 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5wkv\" (UniqueName: \"kubernetes.io/projected/41151abf-171c-435c-ae8c-172e6c55ba6c-kube-api-access-c5wkv\") pod \"cluster-samples-operator-665b6dd947-zk8cm\" (UID: \"41151abf-171c-435c-ae8c-172e6c55ba6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.454026 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q48tw\" (UniqueName: \"kubernetes.io/projected/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-kube-api-access-q48tw\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.454080 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/35c0378e-2da0-4e94-8230-2db66a4c7993-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.454096 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85f17fc8-a7a8-4fdf-88d2-7f9471b61c79-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vmm8j\" (UID: \"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.454113 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a14c13d8-b1d3-40c3-84c5-13f3e2060948-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gh6dt\" (UID: \"a14c13d8-b1d3-40c3-84c5-13f3e2060948\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.454183 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f4x5\" (UniqueName: \"kubernetes.io/projected/0a7ac40a-0ecf-482a-9353-e6c71787da7e-kube-api-access-5f4x5\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.454282 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8a49f46d-d1b2-4a71-b51f-be562df1fb70-srv-cert\") pod \"catalog-operator-68c6474976-95lnp\" (UID: \"8a49f46d-d1b2-4a71-b51f-be562df1fb70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.454309 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-oauth-serving-cert\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.454341 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/8c1dab92-eafe-4867-a625-c1e7404c7cf0-srv-cert\") pod \"olm-operator-6b444d44fb-vnpgt\" (UID: \"8c1dab92-eafe-4867-a625-c1e7404c7cf0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.454379 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.454607 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-config\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.454625 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.454680 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:40.954666002 +0000 UTC m=+142.054164310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.454976 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt8gd\" (UniqueName: \"kubernetes.io/projected/035df5f8-99c1-4c5c-bb47-87bd5057313e-kube-api-access-vt8gd\") pod \"openshift-controller-manager-operator-756b6f6bc6-gffzl\" (UID: \"035df5f8-99c1-4c5c-bb47-87bd5057313e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.454992 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41151abf-171c-435c-ae8c-172e6c55ba6c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zk8cm\" (UID: \"41151abf-171c-435c-ae8c-172e6c55ba6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455032 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/035df5f8-99c1-4c5c-bb47-87bd5057313e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gffzl\" (UID: \"035df5f8-99c1-4c5c-bb47-87bd5057313e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455092 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-etcd-service-ca\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455133 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/35c0378e-2da0-4e94-8230-2db66a4c7993-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455150 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svsbs\" (UniqueName: \"kubernetes.io/projected/30c4595c-6d61-4e1a-a92a-db32926bae0b-kube-api-access-svsbs\") pod \"multus-admission-controller-857f4d67dd-lhr72\" (UID: \"30c4595c-6d61-4e1a-a92a-db32926bae0b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lhr72" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455190 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9pwb\" (UID: \"8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455225 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txc8l\" (UniqueName: \"kubernetes.io/projected/8c1dab92-eafe-4867-a625-c1e7404c7cf0-kube-api-access-txc8l\") pod 
\"olm-operator-6b444d44fb-vnpgt\" (UID: \"8c1dab92-eafe-4867-a625-c1e7404c7cf0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455272 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a14c13d8-b1d3-40c3-84c5-13f3e2060948-config\") pod \"kube-controller-manager-operator-78b949d7b-gh6dt\" (UID: \"a14c13d8-b1d3-40c3-84c5-13f3e2060948\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455287 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b8224b35-5290-4943-987c-3101d802b811-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ddd75\" (UID: \"b8224b35-5290-4943-987c-3101d802b811\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455304 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjj7h\" (UniqueName: \"kubernetes.io/projected/648ab531-aea8-438e-9a6a-f827594e98b4-kube-api-access-qjj7h\") pod \"route-controller-manager-6576b87f9c-sxp7z\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455320 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85f17fc8-a7a8-4fdf-88d2-7f9471b61c79-metrics-tls\") pod \"ingress-operator-5b745b69d9-vmm8j\" (UID: \"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 
27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455420 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-etcd-ca\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455444 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/648ab531-aea8-438e-9a6a-f827594e98b4-serving-cert\") pod \"route-controller-manager-6576b87f9c-sxp7z\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455464 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/35c0378e-2da0-4e94-8230-2db66a4c7993-registry-certificates\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.455482 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8a49f46d-d1b2-4a71-b51f-be562df1fb70-profile-collector-cert\") pod \"catalog-operator-68c6474976-95lnp\" (UID: \"8a49f46d-d1b2-4a71-b51f-be562df1fb70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.474660 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 27 11:11:40 crc kubenswrapper[4807]: 
I1127 11:11:40.478735 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.505640 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.518586 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.539614 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.540933 4807 projected.go:288] Couldn't get configMap openshift-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556328 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556514 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e091aded-a48b-4eec-896c-22870c1f216d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bst7b\" (UID: \"e091aded-a48b-4eec-896c-22870c1f216d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556535 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg75z\" 
(UniqueName: \"kubernetes.io/projected/e091aded-a48b-4eec-896c-22870c1f216d-kube-api-access-zg75z\") pod \"machine-config-operator-74547568cd-bst7b\" (UID: \"e091aded-a48b-4eec-896c-22870c1f216d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556553 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-plugins-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556581 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24lhb\" (UniqueName: \"kubernetes.io/projected/8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14-kube-api-access-24lhb\") pod \"openshift-config-operator-7777fb866f-p9pwb\" (UID: \"8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556603 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-serving-cert\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556618 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e091aded-a48b-4eec-896c-22870c1f216d-proxy-tls\") pod \"machine-config-operator-74547568cd-bst7b\" (UID: \"e091aded-a48b-4eec-896c-22870c1f216d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:40 crc 
kubenswrapper[4807]: I1127 11:11:40.556641 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47wzn\" (UniqueName: \"kubernetes.io/projected/4000ce66-354d-4cac-9da2-16d78c31b056-kube-api-access-47wzn\") pod \"machine-config-server-88rr6\" (UID: \"4000ce66-354d-4cac-9da2-16d78c31b056\") " pod="openshift-machine-config-operator/machine-config-server-88rr6" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556663 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a7ac40a-0ecf-482a-9353-e6c71787da7e-service-ca-bundle\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556679 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-oauth-config\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556698 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a96dd38-5283-4cea-a3d4-623c6a5191a6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xrg5s\" (UID: \"8a96dd38-5283-4cea-a3d4-623c6a5191a6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556716 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9trp\" (UniqueName: 
\"kubernetes.io/projected/edf54281-ae68-49ee-ad29-b47f066f43df-kube-api-access-w9trp\") pod \"migrator-59844c95c7-ldgcp\" (UID: \"edf54281-ae68-49ee-ad29-b47f066f43df\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ldgcp" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556732 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/65e5ef23-71ad-40ae-81bb-e94d9d298087-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-82ntv\" (UID: \"65e5ef23-71ad-40ae-81bb-e94d9d298087\") " pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556767 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnw5w\" (UniqueName: \"kubernetes.io/projected/85f17fc8-a7a8-4fdf-88d2-7f9471b61c79-kube-api-access-vnw5w\") pod \"ingress-operator-5b745b69d9-vmm8j\" (UID: \"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556782 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq8gs\" (UniqueName: \"kubernetes.io/projected/acbc9004-f19e-419d-b609-2f9dda223b0d-kube-api-access-dq8gs\") pod \"collect-profiles-29404020-wx78n\" (UID: \"acbc9004-f19e-419d-b609-2f9dda223b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556796 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkx7t\" (UniqueName: \"kubernetes.io/projected/658b2bd5-04cf-410b-a749-e0e67246ac4c-kube-api-access-hkx7t\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " 
pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556812 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c4r6\" (UniqueName: \"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-kube-api-access-2c4r6\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556833 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-etcd-client\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556849 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5wkv\" (UniqueName: \"kubernetes.io/projected/41151abf-171c-435c-ae8c-172e6c55ba6c-kube-api-access-c5wkv\") pod \"cluster-samples-operator-665b6dd947-zk8cm\" (UID: \"41151abf-171c-435c-ae8c-172e6c55ba6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556873 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acbc9004-f19e-419d-b609-2f9dda223b0d-config-volume\") pod \"collect-profiles-29404020-wx78n\" (UID: \"acbc9004-f19e-419d-b609-2f9dda223b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556890 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q48tw\" (UniqueName: 
\"kubernetes.io/projected/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-kube-api-access-q48tw\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556905 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acbc9004-f19e-419d-b609-2f9dda223b0d-secret-volume\") pod \"collect-profiles-29404020-wx78n\" (UID: \"acbc9004-f19e-419d-b609-2f9dda223b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556932 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/35c0378e-2da0-4e94-8230-2db66a4c7993-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556949 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85f17fc8-a7a8-4fdf-88d2-7f9471b61c79-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vmm8j\" (UID: \"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.556966 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a14c13d8-b1d3-40c3-84c5-13f3e2060948-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gh6dt\" (UID: \"a14c13d8-b1d3-40c3-84c5-13f3e2060948\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" Nov 27 11:11:40 crc 
kubenswrapper[4807]: I1127 11:11:40.557002 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f4x5\" (UniqueName: \"kubernetes.io/projected/0a7ac40a-0ecf-482a-9353-e6c71787da7e-kube-api-access-5f4x5\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557018 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eba22ccb-d6b0-4f36-8583-3a602716a832-metrics-tls\") pod \"dns-default-rqc9w\" (UID: \"eba22ccb-d6b0-4f36-8583-3a602716a832\") " pod="openshift-dns/dns-default-rqc9w" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557033 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgr6d\" (UniqueName: \"kubernetes.io/projected/eba22ccb-d6b0-4f36-8583-3a602716a832-kube-api-access-kgr6d\") pod \"dns-default-rqc9w\" (UID: \"eba22ccb-d6b0-4f36-8583-3a602716a832\") " pod="openshift-dns/dns-default-rqc9w" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557059 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4vwl\" (UniqueName: \"kubernetes.io/projected/8a96dd38-5283-4cea-a3d4-623c6a5191a6-kube-api-access-q4vwl\") pod \"control-plane-machine-set-operator-78cbb6b69f-xrg5s\" (UID: \"8a96dd38-5283-4cea-a3d4-623c6a5191a6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557083 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8a49f46d-d1b2-4a71-b51f-be562df1fb70-srv-cert\") pod \"catalog-operator-68c6474976-95lnp\" (UID: \"8a49f46d-d1b2-4a71-b51f-be562df1fb70\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557098 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1689f4c3-757a-4e7c-a8b5-80177a556a0c-serving-cert\") pod \"service-ca-operator-777779d784-gxsrj\" (UID: \"1689f4c3-757a-4e7c-a8b5-80177a556a0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557114 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-oauth-serving-cert\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557130 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6e0ca81-eb66-426e-b8b0-1bff40087694-webhook-cert\") pod \"packageserver-d55dfcdfc-72xkj\" (UID: \"d6e0ca81-eb66-426e-b8b0-1bff40087694\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557143 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-registration-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557161 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8c1dab92-eafe-4867-a625-c1e7404c7cf0-srv-cert\") pod 
\"olm-operator-6b444d44fb-vnpgt\" (UID: \"8c1dab92-eafe-4867-a625-c1e7404c7cf0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557190 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt6qk\" (UniqueName: \"kubernetes.io/projected/45212442-181e-4e28-a388-e87c6dce9bac-kube-api-access-gt6qk\") pod \"ingress-canary-p9f4b\" (UID: \"45212442-181e-4e28-a388-e87c6dce9bac\") " pod="openshift-ingress-canary/ingress-canary-p9f4b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557207 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-config\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557223 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557239 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt8gd\" (UniqueName: \"kubernetes.io/projected/035df5f8-99c1-4c5c-bb47-87bd5057313e-kube-api-access-vt8gd\") pod \"openshift-controller-manager-operator-756b6f6bc6-gffzl\" (UID: \"035df5f8-99c1-4c5c-bb47-87bd5057313e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557271 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41151abf-171c-435c-ae8c-172e6c55ba6c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zk8cm\" (UID: \"41151abf-171c-435c-ae8c-172e6c55ba6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557292 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/035df5f8-99c1-4c5c-bb47-87bd5057313e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gffzl\" (UID: \"035df5f8-99c1-4c5c-bb47-87bd5057313e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557335 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-etcd-service-ca\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557351 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eeca1a6-a2a9-4c52-badc-a9d78405979c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9g2c\" (UID: \"5eeca1a6-a2a9-4c52-badc-a9d78405979c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557369 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/35c0378e-2da0-4e94-8230-2db66a4c7993-ca-trust-extracted\") pod 
\"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557386 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svsbs\" (UniqueName: \"kubernetes.io/projected/30c4595c-6d61-4e1a-a92a-db32926bae0b-kube-api-access-svsbs\") pod \"multus-admission-controller-857f4d67dd-lhr72\" (UID: \"30c4595c-6d61-4e1a-a92a-db32926bae0b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lhr72" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557403 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9pwb\" (UID: \"8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557418 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txc8l\" (UniqueName: \"kubernetes.io/projected/8c1dab92-eafe-4867-a625-c1e7404c7cf0-kube-api-access-txc8l\") pod \"olm-operator-6b444d44fb-vnpgt\" (UID: \"8c1dab92-eafe-4867-a625-c1e7404c7cf0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557431 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45212442-181e-4e28-a388-e87c6dce9bac-cert\") pod \"ingress-canary-p9f4b\" (UID: \"45212442-181e-4e28-a388-e87c6dce9bac\") " pod="openshift-ingress-canary/ingress-canary-p9f4b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557446 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-m5qjn\" (UniqueName: \"kubernetes.io/projected/1689f4c3-757a-4e7c-a8b5-80177a556a0c-kube-api-access-m5qjn\") pod \"service-ca-operator-777779d784-gxsrj\" (UID: \"1689f4c3-757a-4e7c-a8b5-80177a556a0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557461 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e091aded-a48b-4eec-896c-22870c1f216d-images\") pod \"machine-config-operator-74547568cd-bst7b\" (UID: \"e091aded-a48b-4eec-896c-22870c1f216d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557479 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjj7h\" (UniqueName: \"kubernetes.io/projected/648ab531-aea8-438e-9a6a-f827594e98b4-kube-api-access-qjj7h\") pod \"route-controller-manager-6576b87f9c-sxp7z\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557493 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a14c13d8-b1d3-40c3-84c5-13f3e2060948-config\") pod \"kube-controller-manager-operator-78b949d7b-gh6dt\" (UID: \"a14c13d8-b1d3-40c3-84c5-13f3e2060948\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557510 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b8224b35-5290-4943-987c-3101d802b811-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ddd75\" (UID: 
\"b8224b35-5290-4943-987c-3101d802b811\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557526 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/648ab531-aea8-438e-9a6a-f827594e98b4-serving-cert\") pod \"route-controller-manager-6576b87f9c-sxp7z\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557542 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85f17fc8-a7a8-4fdf-88d2-7f9471b61c79-metrics-tls\") pod \"ingress-operator-5b745b69d9-vmm8j\" (UID: \"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557559 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-etcd-ca\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557575 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/35c0378e-2da0-4e94-8230-2db66a4c7993-registry-certificates\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557592 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/4000ce66-354d-4cac-9da2-16d78c31b056-node-bootstrap-token\") pod \"machine-config-server-88rr6\" (UID: \"4000ce66-354d-4cac-9da2-16d78c31b056\") " pod="openshift-machine-config-operator/machine-config-server-88rr6" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557608 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65e5ef23-71ad-40ae-81bb-e94d9d298087-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-82ntv\" (UID: \"65e5ef23-71ad-40ae-81bb-e94d9d298087\") " pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557625 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8a49f46d-d1b2-4a71-b51f-be562df1fb70-profile-collector-cert\") pod \"catalog-operator-68c6474976-95lnp\" (UID: \"8a49f46d-d1b2-4a71-b51f-be562df1fb70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557667 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-service-ca\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557690 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85f17fc8-a7a8-4fdf-88d2-7f9471b61c79-trusted-ca\") pod \"ingress-operator-5b745b69d9-vmm8j\" (UID: \"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557708 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8c1dab92-eafe-4867-a625-c1e7404c7cf0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vnpgt\" (UID: \"8c1dab92-eafe-4867-a625-c1e7404c7cf0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557724 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwq8n\" (UniqueName: \"kubernetes.io/projected/b8224b35-5290-4943-987c-3101d802b811-kube-api-access-gwq8n\") pod \"machine-config-controller-84d6567774-ddd75\" (UID: \"b8224b35-5290-4943-987c-3101d802b811\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.557740 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6e0ca81-eb66-426e-b8b0-1bff40087694-apiservice-cert\") pod \"packageserver-d55dfcdfc-72xkj\" (UID: \"d6e0ca81-eb66-426e-b8b0-1bff40087694\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.557794 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.057769119 +0000 UTC m=+142.157267397 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.558944 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.559265 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a14c13d8-b1d3-40c3-84c5-13f3e2060948-config\") pod \"kube-controller-manager-operator-78b949d7b-gh6dt\" (UID: \"a14c13d8-b1d3-40c3-84c5-13f3e2060948\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.559786 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/35c0378e-2da0-4e94-8230-2db66a4c7993-ca-trust-extracted\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.560602 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-etcd-service-ca\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.560747 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-serving-cert\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.560787 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-service-ca-bundle\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.560814 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnh2m\" (UniqueName: \"kubernetes.io/projected/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-kube-api-access-bnh2m\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.560835 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0a7ac40a-0ecf-482a-9353-e6c71787da7e-stats-auth\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.560856 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-config\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " 
pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.560897 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035df5f8-99c1-4c5c-bb47-87bd5057313e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gffzl\" (UID: \"035df5f8-99c1-4c5c-bb47-87bd5057313e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.560927 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j24jb\" (UniqueName: \"kubernetes.io/projected/8e8444dc-ad7b-41c1-b524-ebd0e453eb46-kube-api-access-j24jb\") pod \"package-server-manager-789f6589d5-2pt99\" (UID: \"8e8444dc-ad7b-41c1-b524-ebd0e453eb46\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.560953 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-mountpoint-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.560987 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv2px\" (UniqueName: \"kubernetes.io/projected/bbba7218-b809-4e7d-ba6e-071ff7261ba2-kube-api-access-wv2px\") pod \"service-ca-9c57cc56f-rqbbk\" (UID: \"bbba7218-b809-4e7d-ba6e-071ff7261ba2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.561010 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-socket-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.561083 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-service-ca\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.561571 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-etcd-ca\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.562461 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/35c0378e-2da0-4e94-8230-2db66a4c7993-registry-certificates\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.563093 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-config\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.563319 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/035df5f8-99c1-4c5c-bb47-87bd5057313e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gffzl\" (UID: \"035df5f8-99c1-4c5c-bb47-87bd5057313e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.564145 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/85f17fc8-a7a8-4fdf-88d2-7f9471b61c79-metrics-tls\") pod \"ingress-operator-5b745b69d9-vmm8j\" (UID: \"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.565631 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8c1dab92-eafe-4867-a625-c1e7404c7cf0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-vnpgt\" (UID: \"8c1dab92-eafe-4867-a625-c1e7404c7cf0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.565996 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/8a49f46d-d1b2-4a71-b51f-be562df1fb70-profile-collector-cert\") pod \"catalog-operator-68c6474976-95lnp\" (UID: \"8a49f46d-d1b2-4a71-b51f-be562df1fb70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.566041 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b8224b35-5290-4943-987c-3101d802b811-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-ddd75\" (UID: \"b8224b35-5290-4943-987c-3101d802b811\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" Nov 27 
11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.566289 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a14c13d8-b1d3-40c3-84c5-13f3e2060948-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gh6dt\" (UID: \"a14c13d8-b1d3-40c3-84c5-13f3e2060948\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.566521 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-oauth-config\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.566906 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035df5f8-99c1-4c5c-bb47-87bd5057313e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gffzl\" (UID: \"035df5f8-99c1-4c5c-bb47-87bd5057313e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567030 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eeca1a6-a2a9-4c52-badc-a9d78405979c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9g2c\" (UID: \"5eeca1a6-a2a9-4c52-badc-a9d78405979c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567090 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-bound-sa-token\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567136 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9pwb\" (UID: \"8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567199 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-serving-cert\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567227 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bbba7218-b809-4e7d-ba6e-071ff7261ba2-signing-key\") pod \"service-ca-9c57cc56f-rqbbk\" (UID: \"bbba7218-b809-4e7d-ba6e-071ff7261ba2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567271 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eba22ccb-d6b0-4f36-8583-3a602716a832-config-volume\") pod \"dns-default-rqc9w\" (UID: \"eba22ccb-d6b0-4f36-8583-3a602716a832\") " pod="openshift-dns/dns-default-rqc9w" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567304 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd94s\" (UniqueName: \"kubernetes.io/projected/d6e0ca81-eb66-426e-b8b0-1bff40087694-kube-api-access-fd94s\") pod \"packageserver-d55dfcdfc-72xkj\" (UID: \"d6e0ca81-eb66-426e-b8b0-1bff40087694\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567477 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85f17fc8-a7a8-4fdf-88d2-7f9471b61c79-trusted-ca\") pod \"ingress-operator-5b745b69d9-vmm8j\" (UID: \"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567543 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-oauth-serving-cert\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567695 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9pwb\" (UID: \"8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567705 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-service-ca-bundle\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567767 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4edf81da-33cf-4824-9bac-4926f59ccea1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-n7cfz\" (UID: \"4edf81da-33cf-4824-9bac-4926f59ccea1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567795 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-trusted-ca-bundle\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567826 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktp87\" (UniqueName: \"kubernetes.io/projected/65e5ef23-71ad-40ae-81bb-e94d9d298087-kube-api-access-ktp87\") pod \"marketplace-operator-79b997595-82ntv\" (UID: \"65e5ef23-71ad-40ae-81bb-e94d9d298087\") " pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567891 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35c0378e-2da0-4e94-8230-2db66a4c7993-trusted-ca\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.567980 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.568064 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-etcd-client\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.568595 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/648ab531-aea8-438e-9a6a-f827594e98b4-serving-cert\") pod \"route-controller-manager-6576b87f9c-sxp7z\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.568770 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-config\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.568893 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a14c13d8-b1d3-40c3-84c5-13f3e2060948-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gh6dt\" (UID: \"a14c13d8-b1d3-40c3-84c5-13f3e2060948\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569040 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-trusted-ca-bundle\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569053 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/648ab531-aea8-438e-9a6a-f827594e98b4-client-ca\") pod \"route-controller-manager-6576b87f9c-sxp7z\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569118 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-csi-data-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569183 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgrz9\" (UniqueName: \"kubernetes.io/projected/5eeca1a6-a2a9-4c52-badc-a9d78405979c-kube-api-access-xgrz9\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9g2c\" (UID: \"5eeca1a6-a2a9-4c52-badc-a9d78405979c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569228 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jdql\" (UniqueName: \"kubernetes.io/projected/8a49f46d-d1b2-4a71-b51f-be562df1fb70-kube-api-access-2jdql\") pod \"catalog-operator-68c6474976-95lnp\" (UID: 
\"8a49f46d-d1b2-4a71-b51f-be562df1fb70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569555 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4edf81da-33cf-4824-9bac-4926f59ccea1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-n7cfz\" (UID: \"4edf81da-33cf-4824-9bac-4926f59ccea1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569637 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl6sp\" (UniqueName: \"kubernetes.io/projected/93c49e07-08ef-4b31-abb3-787a46a3fbfd-kube-api-access-gl6sp\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569677 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bbba7218-b809-4e7d-ba6e-071ff7261ba2-signing-cabundle\") pod \"service-ca-9c57cc56f-rqbbk\" (UID: \"bbba7218-b809-4e7d-ba6e-071ff7261ba2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569737 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1689f4c3-757a-4e7c-a8b5-80177a556a0c-config\") pod \"service-ca-operator-777779d784-gxsrj\" (UID: \"1689f4c3-757a-4e7c-a8b5-80177a556a0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569773 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/b8224b35-5290-4943-987c-3101d802b811-proxy-tls\") pod \"machine-config-controller-84d6567774-ddd75\" (UID: \"b8224b35-5290-4943-987c-3101d802b811\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569831 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a7ac40a-0ecf-482a-9353-e6c71787da7e-metrics-certs\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569867 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/30c4595c-6d61-4e1a-a92a-db32926bae0b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lhr72\" (UID: \"30c4595c-6d61-4e1a-a92a-db32926bae0b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lhr72" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569887 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/648ab531-aea8-438e-9a6a-f827594e98b4-client-ca\") pod \"route-controller-manager-6576b87f9c-sxp7z\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569900 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edf81da-33cf-4824-9bac-4926f59ccea1-config\") pod \"kube-apiserver-operator-766d6c64bb-n7cfz\" (UID: \"4edf81da-33cf-4824-9bac-4926f59ccea1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.569938 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0a7ac40a-0ecf-482a-9353-e6c71787da7e-default-certificate\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.570043 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d6e0ca81-eb66-426e-b8b0-1bff40087694-tmpfs\") pod \"packageserver-d55dfcdfc-72xkj\" (UID: \"d6e0ca81-eb66-426e-b8b0-1bff40087694\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.570086 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-config\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.570115 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-registry-tls\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.570138 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4000ce66-354d-4cac-9da2-16d78c31b056-certs\") pod \"machine-config-server-88rr6\" (UID: \"4000ce66-354d-4cac-9da2-16d78c31b056\") " pod="openshift-machine-config-operator/machine-config-server-88rr6" 
Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.570186 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e8444dc-ad7b-41c1-b524-ebd0e453eb46-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2pt99\" (UID: \"8e8444dc-ad7b-41c1-b524-ebd0e453eb46\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.570303 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/648ab531-aea8-438e-9a6a-f827594e98b4-config\") pod \"route-controller-manager-6576b87f9c-sxp7z\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.570440 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-serving-cert\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.571155 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a7ac40a-0ecf-482a-9353-e6c71787da7e-service-ca-bundle\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.571468 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8a49f46d-d1b2-4a71-b51f-be562df1fb70-srv-cert\") pod \"catalog-operator-68c6474976-95lnp\" (UID: 
\"8a49f46d-d1b2-4a71-b51f-be562df1fb70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.572033 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-config\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.572200 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/648ab531-aea8-438e-9a6a-f827594e98b4-config\") pod \"route-controller-manager-6576b87f9c-sxp7z\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.572332 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41151abf-171c-435c-ae8c-172e6c55ba6c-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zk8cm\" (UID: \"41151abf-171c-435c-ae8c-172e6c55ba6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.572813 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-serving-cert\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.572946 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/b8224b35-5290-4943-987c-3101d802b811-proxy-tls\") pod \"machine-config-controller-84d6567774-ddd75\" (UID: \"b8224b35-5290-4943-987c-3101d802b811\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.573468 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edf81da-33cf-4824-9bac-4926f59ccea1-config\") pod \"kube-apiserver-operator-766d6c64bb-n7cfz\" (UID: \"4edf81da-33cf-4824-9bac-4926f59ccea1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.573669 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/30c4595c-6d61-4e1a-a92a-db32926bae0b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lhr72\" (UID: \"30c4595c-6d61-4e1a-a92a-db32926bae0b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lhr72" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.574845 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0a7ac40a-0ecf-482a-9353-e6c71787da7e-stats-auth\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.575235 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0a7ac40a-0ecf-482a-9353-e6c71787da7e-default-certificate\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.575651 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a7ac40a-0ecf-482a-9353-e6c71787da7e-metrics-certs\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.575822 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9pwb\" (UID: \"8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.576681 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4edf81da-33cf-4824-9bac-4926f59ccea1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-n7cfz\" (UID: \"4edf81da-33cf-4824-9bac-4926f59ccea1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.577034 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-registry-tls\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.577490 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/35c0378e-2da0-4e94-8230-2db66a4c7993-installation-pull-secrets\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.578366 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-serving-cert\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.579116 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.581099 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/8c1dab92-eafe-4867-a625-c1e7404c7cf0-srv-cert\") pod \"olm-operator-6b444d44fb-vnpgt\" (UID: \"8c1dab92-eafe-4867-a625-c1e7404c7cf0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.598506 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.618385 4807 projected.go:288] Couldn't get configMap openshift-authentication/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.618451 4807 projected.go:194] Error preparing data for projected volume kube-api-access-z4c84 for pod openshift-authentication/oauth-openshift-558db77b4-qsdql: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.618519 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7462250b-699f-4fff-9600-8dff49efc2e8-kube-api-access-z4c84 podName:7462250b-699f-4fff-9600-8dff49efc2e8 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.118497886 +0000 UTC m=+142.217996084 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z4c84" (UniqueName: "kubernetes.io/projected/7462250b-699f-4fff-9600-8dff49efc2e8-kube-api-access-z4c84") pod "oauth-openshift-558db77b4-qsdql" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.630287 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.635740 4807 projected.go:288] Couldn't get configMap openshift-oauth-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.638365 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.659884 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672010 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt6qk\" (UniqueName: \"kubernetes.io/projected/45212442-181e-4e28-a388-e87c6dce9bac-kube-api-access-gt6qk\") pod \"ingress-canary-p9f4b\" (UID: \"45212442-181e-4e28-a388-e87c6dce9bac\") " pod="openshift-ingress-canary/ingress-canary-p9f4b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672092 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eeca1a6-a2a9-4c52-badc-a9d78405979c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9g2c\" (UID: \"5eeca1a6-a2a9-4c52-badc-a9d78405979c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 
11:11:40.672146 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45212442-181e-4e28-a388-e87c6dce9bac-cert\") pod \"ingress-canary-p9f4b\" (UID: \"45212442-181e-4e28-a388-e87c6dce9bac\") " pod="openshift-ingress-canary/ingress-canary-p9f4b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672170 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qjn\" (UniqueName: \"kubernetes.io/projected/1689f4c3-757a-4e7c-a8b5-80177a556a0c-kube-api-access-m5qjn\") pod \"service-ca-operator-777779d784-gxsrj\" (UID: \"1689f4c3-757a-4e7c-a8b5-80177a556a0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672194 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e091aded-a48b-4eec-896c-22870c1f216d-images\") pod \"machine-config-operator-74547568cd-bst7b\" (UID: \"e091aded-a48b-4eec-896c-22870c1f216d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672227 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4000ce66-354d-4cac-9da2-16d78c31b056-node-bootstrap-token\") pod \"machine-config-server-88rr6\" (UID: \"4000ce66-354d-4cac-9da2-16d78c31b056\") " pod="openshift-machine-config-operator/machine-config-server-88rr6" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672272 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65e5ef23-71ad-40ae-81bb-e94d9d298087-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-82ntv\" (UID: \"65e5ef23-71ad-40ae-81bb-e94d9d298087\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672336 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6e0ca81-eb66-426e-b8b0-1bff40087694-apiservice-cert\") pod \"packageserver-d55dfcdfc-72xkj\" (UID: \"d6e0ca81-eb66-426e-b8b0-1bff40087694\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672492 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j24jb\" (UniqueName: \"kubernetes.io/projected/8e8444dc-ad7b-41c1-b524-ebd0e453eb46-kube-api-access-j24jb\") pod \"package-server-manager-789f6589d5-2pt99\" (UID: \"8e8444dc-ad7b-41c1-b524-ebd0e453eb46\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672517 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-mountpoint-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672555 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv2px\" (UniqueName: \"kubernetes.io/projected/bbba7218-b809-4e7d-ba6e-071ff7261ba2-kube-api-access-wv2px\") pod \"service-ca-9c57cc56f-rqbbk\" (UID: \"bbba7218-b809-4e7d-ba6e-071ff7261ba2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672578 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-socket-dir\") pod 
\"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672610 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eeca1a6-a2a9-4c52-badc-a9d78405979c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9g2c\" (UID: \"5eeca1a6-a2a9-4c52-badc-a9d78405979c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672648 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd94s\" (UniqueName: \"kubernetes.io/projected/d6e0ca81-eb66-426e-b8b0-1bff40087694-kube-api-access-fd94s\") pod \"packageserver-d55dfcdfc-72xkj\" (UID: \"d6e0ca81-eb66-426e-b8b0-1bff40087694\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672702 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bbba7218-b809-4e7d-ba6e-071ff7261ba2-signing-key\") pod \"service-ca-9c57cc56f-rqbbk\" (UID: \"bbba7218-b809-4e7d-ba6e-071ff7261ba2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672723 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eba22ccb-d6b0-4f36-8583-3a602716a832-config-volume\") pod \"dns-default-rqc9w\" (UID: \"eba22ccb-d6b0-4f36-8583-3a602716a832\") " pod="openshift-dns/dns-default-rqc9w" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672776 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktp87\" (UniqueName: 
\"kubernetes.io/projected/65e5ef23-71ad-40ae-81bb-e94d9d298087-kube-api-access-ktp87\") pod \"marketplace-operator-79b997595-82ntv\" (UID: \"65e5ef23-71ad-40ae-81bb-e94d9d298087\") " pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672827 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-csi-data-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672850 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgrz9\" (UniqueName: \"kubernetes.io/projected/5eeca1a6-a2a9-4c52-badc-a9d78405979c-kube-api-access-xgrz9\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9g2c\" (UID: \"5eeca1a6-a2a9-4c52-badc-a9d78405979c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672883 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bbba7218-b809-4e7d-ba6e-071ff7261ba2-signing-cabundle\") pod \"service-ca-9c57cc56f-rqbbk\" (UID: \"bbba7218-b809-4e7d-ba6e-071ff7261ba2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672906 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1689f4c3-757a-4e7c-a8b5-80177a556a0c-config\") pod \"service-ca-operator-777779d784-gxsrj\" (UID: \"1689f4c3-757a-4e7c-a8b5-80177a556a0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 
11:11:40.672938 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d6e0ca81-eb66-426e-b8b0-1bff40087694-tmpfs\") pod \"packageserver-d55dfcdfc-72xkj\" (UID: \"d6e0ca81-eb66-426e-b8b0-1bff40087694\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672965 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e8444dc-ad7b-41c1-b524-ebd0e453eb46-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2pt99\" (UID: \"8e8444dc-ad7b-41c1-b524-ebd0e453eb46\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.672988 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4000ce66-354d-4cac-9da2-16d78c31b056-certs\") pod \"machine-config-server-88rr6\" (UID: \"4000ce66-354d-4cac-9da2-16d78c31b056\") " pod="openshift-machine-config-operator/machine-config-server-88rr6" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673029 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e091aded-a48b-4eec-896c-22870c1f216d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bst7b\" (UID: \"e091aded-a48b-4eec-896c-22870c1f216d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673052 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg75z\" (UniqueName: \"kubernetes.io/projected/e091aded-a48b-4eec-896c-22870c1f216d-kube-api-access-zg75z\") pod \"machine-config-operator-74547568cd-bst7b\" (UID: 
\"e091aded-a48b-4eec-896c-22870c1f216d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673072 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-plugins-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673100 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e091aded-a48b-4eec-896c-22870c1f216d-images\") pod \"machine-config-operator-74547568cd-bst7b\" (UID: \"e091aded-a48b-4eec-896c-22870c1f216d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673114 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e091aded-a48b-4eec-896c-22870c1f216d-proxy-tls\") pod \"machine-config-operator-74547568cd-bst7b\" (UID: \"e091aded-a48b-4eec-896c-22870c1f216d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673140 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47wzn\" (UniqueName: \"kubernetes.io/projected/4000ce66-354d-4cac-9da2-16d78c31b056-kube-api-access-47wzn\") pod \"machine-config-server-88rr6\" (UID: \"4000ce66-354d-4cac-9da2-16d78c31b056\") " pod="openshift-machine-config-operator/machine-config-server-88rr6" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673189 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/8a96dd38-5283-4cea-a3d4-623c6a5191a6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xrg5s\" (UID: \"8a96dd38-5283-4cea-a3d4-623c6a5191a6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673222 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/65e5ef23-71ad-40ae-81bb-e94d9d298087-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-82ntv\" (UID: \"65e5ef23-71ad-40ae-81bb-e94d9d298087\") " pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673297 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq8gs\" (UniqueName: \"kubernetes.io/projected/acbc9004-f19e-419d-b609-2f9dda223b0d-kube-api-access-dq8gs\") pod \"collect-profiles-29404020-wx78n\" (UID: \"acbc9004-f19e-419d-b609-2f9dda223b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673322 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkx7t\" (UniqueName: \"kubernetes.io/projected/658b2bd5-04cf-410b-a749-e0e67246ac4c-kube-api-access-hkx7t\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673373 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acbc9004-f19e-419d-b609-2f9dda223b0d-config-volume\") pod \"collect-profiles-29404020-wx78n\" (UID: \"acbc9004-f19e-419d-b609-2f9dda223b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" Nov 
27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673406 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acbc9004-f19e-419d-b609-2f9dda223b0d-secret-volume\") pod \"collect-profiles-29404020-wx78n\" (UID: \"acbc9004-f19e-419d-b609-2f9dda223b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673488 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eba22ccb-d6b0-4f36-8583-3a602716a832-metrics-tls\") pod \"dns-default-rqc9w\" (UID: \"eba22ccb-d6b0-4f36-8583-3a602716a832\") " pod="openshift-dns/dns-default-rqc9w" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673512 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgr6d\" (UniqueName: \"kubernetes.io/projected/eba22ccb-d6b0-4f36-8583-3a602716a832-kube-api-access-kgr6d\") pod \"dns-default-rqc9w\" (UID: \"eba22ccb-d6b0-4f36-8583-3a602716a832\") " pod="openshift-dns/dns-default-rqc9w" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673544 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4vwl\" (UniqueName: \"kubernetes.io/projected/8a96dd38-5283-4cea-a3d4-623c6a5191a6-kube-api-access-q4vwl\") pod \"control-plane-machine-set-operator-78cbb6b69f-xrg5s\" (UID: \"8a96dd38-5283-4cea-a3d4-623c6a5191a6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673577 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1689f4c3-757a-4e7c-a8b5-80177a556a0c-serving-cert\") pod \"service-ca-operator-777779d784-gxsrj\" (UID: \"1689f4c3-757a-4e7c-a8b5-80177a556a0c\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673599 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6e0ca81-eb66-426e-b8b0-1bff40087694-webhook-cert\") pod \"packageserver-d55dfcdfc-72xkj\" (UID: \"d6e0ca81-eb66-426e-b8b0-1bff40087694\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673642 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-registration-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673680 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.673919 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65e5ef23-71ad-40ae-81bb-e94d9d298087-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-82ntv\" (UID: \"65e5ef23-71ad-40ae-81bb-e94d9d298087\") " pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.674022 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-11-27 11:11:41.174007387 +0000 UTC m=+142.273505625 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.674438 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-csi-data-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.674441 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-socket-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.674633 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-mountpoint-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.675162 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45212442-181e-4e28-a388-e87c6dce9bac-cert\") pod \"ingress-canary-p9f4b\" (UID: 
\"45212442-181e-4e28-a388-e87c6dce9bac\") " pod="openshift-ingress-canary/ingress-canary-p9f4b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.675216 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/4000ce66-354d-4cac-9da2-16d78c31b056-node-bootstrap-token\") pod \"machine-config-server-88rr6\" (UID: \"4000ce66-354d-4cac-9da2-16d78c31b056\") " pod="openshift-machine-config-operator/machine-config-server-88rr6" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.675274 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eeca1a6-a2a9-4c52-badc-a9d78405979c-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9g2c\" (UID: \"5eeca1a6-a2a9-4c52-badc-a9d78405979c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.675674 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5eeca1a6-a2a9-4c52-badc-a9d78405979c-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9g2c\" (UID: \"5eeca1a6-a2a9-4c52-badc-a9d78405979c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.676097 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/d6e0ca81-eb66-426e-b8b0-1bff40087694-tmpfs\") pod \"packageserver-d55dfcdfc-72xkj\" (UID: \"d6e0ca81-eb66-426e-b8b0-1bff40087694\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.676751 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1689f4c3-757a-4e7c-a8b5-80177a556a0c-config\") pod \"service-ca-operator-777779d784-gxsrj\" (UID: \"1689f4c3-757a-4e7c-a8b5-80177a556a0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.677445 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d6e0ca81-eb66-426e-b8b0-1bff40087694-apiservice-cert\") pod \"packageserver-d55dfcdfc-72xkj\" (UID: \"d6e0ca81-eb66-426e-b8b0-1bff40087694\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.677833 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-registration-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.677832 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bbba7218-b809-4e7d-ba6e-071ff7261ba2-signing-cabundle\") pod \"service-ca-9c57cc56f-rqbbk\" (UID: \"bbba7218-b809-4e7d-ba6e-071ff7261ba2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.677941 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e091aded-a48b-4eec-896c-22870c1f216d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bst7b\" (UID: \"e091aded-a48b-4eec-896c-22870c1f216d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.678103 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acbc9004-f19e-419d-b609-2f9dda223b0d-config-volume\") pod \"collect-profiles-29404020-wx78n\" (UID: \"acbc9004-f19e-419d-b609-2f9dda223b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.678290 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/658b2bd5-04cf-410b-a749-e0e67246ac4c-plugins-dir\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.677930 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eba22ccb-d6b0-4f36-8583-3a602716a832-config-volume\") pod \"dns-default-rqc9w\" (UID: \"eba22ccb-d6b0-4f36-8583-3a602716a832\") " pod="openshift-dns/dns-default-rqc9w" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.679695 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.680095 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bbba7218-b809-4e7d-ba6e-071ff7261ba2-signing-key\") pod \"service-ca-9c57cc56f-rqbbk\" (UID: \"bbba7218-b809-4e7d-ba6e-071ff7261ba2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.680214 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e091aded-a48b-4eec-896c-22870c1f216d-proxy-tls\") pod \"machine-config-operator-74547568cd-bst7b\" (UID: \"e091aded-a48b-4eec-896c-22870c1f216d\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.680274 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acbc9004-f19e-419d-b609-2f9dda223b0d-secret-volume\") pod \"collect-profiles-29404020-wx78n\" (UID: \"acbc9004-f19e-419d-b609-2f9dda223b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.680851 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8e8444dc-ad7b-41c1-b524-ebd0e453eb46-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2pt99\" (UID: \"8e8444dc-ad7b-41c1-b524-ebd0e453eb46\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.681191 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d6e0ca81-eb66-426e-b8b0-1bff40087694-webhook-cert\") pod \"packageserver-d55dfcdfc-72xkj\" (UID: \"d6e0ca81-eb66-426e-b8b0-1bff40087694\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.682401 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/65e5ef23-71ad-40ae-81bb-e94d9d298087-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-82ntv\" (UID: \"65e5ef23-71ad-40ae-81bb-e94d9d298087\") " pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.682531 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1689f4c3-757a-4e7c-a8b5-80177a556a0c-serving-cert\") pod \"service-ca-operator-777779d784-gxsrj\" (UID: \"1689f4c3-757a-4e7c-a8b5-80177a556a0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.684767 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/4000ce66-354d-4cac-9da2-16d78c31b056-certs\") pod \"machine-config-server-88rr6\" (UID: \"4000ce66-354d-4cac-9da2-16d78c31b056\") " pod="openshift-machine-config-operator/machine-config-server-88rr6" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.685837 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a96dd38-5283-4cea-a3d4-623c6a5191a6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xrg5s\" (UID: \"8a96dd38-5283-4cea-a3d4-623c6a5191a6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.688768 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eba22ccb-d6b0-4f36-8583-3a602716a832-metrics-tls\") pod \"dns-default-rqc9w\" (UID: \"eba22ccb-d6b0-4f36-8583-3a602716a832\") " pod="openshift-dns/dns-default-rqc9w" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.698643 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.719804 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.737397 4807 request.go:700] Waited for 1.726894154s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.739071 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.741049 4807 projected.go:194] Error preparing data for projected volume kube-api-access-2tk46 for pod openshift-controller-manager/controller-manager-879f6c89f-4kzvt: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.741119 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e38d5152-81eb-46c2-9753-84286838528f-kube-api-access-2tk46 podName:e38d5152-81eb-46c2-9753-84286838528f nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.241099103 +0000 UTC m=+142.340597301 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2tk46" (UniqueName: "kubernetes.io/projected/e38d5152-81eb-46c2-9753-84286838528f-kube-api-access-2tk46") pod "controller-manager-879f6c89f-4kzvt" (UID: "e38d5152-81eb-46c2-9753-84286838528f") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.759170 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.774333 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.775231 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.275211048 +0000 UTC m=+142.374709246 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.778934 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.798654 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.803717 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5t6m\" (UniqueName: \"kubernetes.io/projected/a0a86502-013f-4060-91d7-cf5cd9353ccf-kube-api-access-l5t6m\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.820022 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.838983 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.859581 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.876154 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.876492 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.376476061 +0000 UTC m=+142.475974259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.878008 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.899445 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.906902 4807 projected.go:194] Error preparing data for projected volume kube-api-access-hftrd for pod openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz: failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.906991 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6711e08e-05e5-4111-9e1e-fd5ed988a718-kube-api-access-hftrd 
podName:6711e08e-05e5-4111-9e1e-fd5ed988a718 nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.406974098 +0000 UTC m=+142.506472296 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hftrd" (UniqueName: "kubernetes.io/projected/6711e08e-05e5-4111-9e1e-fd5ed988a718-kube-api-access-hftrd") pod "apiserver-7bbb656c7d-hkmnz" (UID: "6711e08e-05e5-4111-9e1e-fd5ed988a718") : failed to sync configmap cache: timed out waiting for the condition Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.918854 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.938021 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.958405 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.976591 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:40 crc kubenswrapper[4807]: E1127 11:11:40.977212 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.477198247 +0000 UTC m=+142.576696445 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.978739 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 27 11:11:40 crc kubenswrapper[4807]: I1127 11:11:40.999124 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.018151 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.038899 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.064224 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.069354 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35c0378e-2da0-4e94-8230-2db66a4c7993-trusted-ca\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.077914 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:41 crc kubenswrapper[4807]: E1127 11:11:41.078372 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.578361107 +0000 UTC m=+142.677859305 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.113697 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f4x5\" (UniqueName: \"kubernetes.io/projected/0a7ac40a-0ecf-482a-9353-e6c71787da7e-kube-api-access-5f4x5\") pod \"router-default-5444994796-7kpfm\" (UID: \"0a7ac40a-0ecf-482a-9353-e6c71787da7e\") " pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.132160 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24lhb\" (UniqueName: \"kubernetes.io/projected/8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14-kube-api-access-24lhb\") pod \"openshift-config-operator-7777fb866f-p9pwb\" (UID: \"8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.153115 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9trp\" (UniqueName: \"kubernetes.io/projected/edf54281-ae68-49ee-ad29-b47f066f43df-kube-api-access-w9trp\") pod \"migrator-59844c95c7-ldgcp\" (UID: \"edf54281-ae68-49ee-ad29-b47f066f43df\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ldgcp" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.173222 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnw5w\" (UniqueName: \"kubernetes.io/projected/85f17fc8-a7a8-4fdf-88d2-7f9471b61c79-kube-api-access-vnw5w\") pod \"ingress-operator-5b745b69d9-vmm8j\" (UID: \"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.179944 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:41 crc kubenswrapper[4807]: E1127 11:11:41.180116 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.680088773 +0000 UTC m=+142.779586991 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.180313 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4c84\" (UniqueName: \"kubernetes.io/projected/7462250b-699f-4fff-9600-8dff49efc2e8-kube-api-access-z4c84\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.180888 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.180917 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:41 crc kubenswrapper[4807]: E1127 11:11:41.181456 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.681440623 +0000 UTC m=+142.780938891 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.183916 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4c84\" (UniqueName: \"kubernetes.io/projected/7462250b-699f-4fff-9600-8dff49efc2e8-kube-api-access-z4c84\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.194034 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c4r6\" (UniqueName: \"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-kube-api-access-2c4r6\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:41 crc kubenswrapper[4807]: W1127 11:11:41.216490 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a7ac40a_0ecf_482a_9353_e6c71787da7e.slice/crio-abe91294361aa535ba0dbcd3ecedda88144e6080107373d5ee8516bd18dfe29f WatchSource:0}: Error finding container abe91294361aa535ba0dbcd3ecedda88144e6080107373d5ee8516bd18dfe29f: Status 404 returned error can't find the container with id abe91294361aa535ba0dbcd3ecedda88144e6080107373d5ee8516bd18dfe29f Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.226857 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ldgcp" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.231347 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/85f17fc8-a7a8-4fdf-88d2-7f9471b61c79-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vmm8j\" (UID: \"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.235697 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwq8n\" (UniqueName: \"kubernetes.io/projected/b8224b35-5290-4943-987c-3101d802b811-kube-api-access-gwq8n\") pod \"machine-config-controller-84d6567774-ddd75\" (UID: \"b8224b35-5290-4943-987c-3101d802b811\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.255071 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5wkv\" (UniqueName: \"kubernetes.io/projected/41151abf-171c-435c-ae8c-172e6c55ba6c-kube-api-access-c5wkv\") pod \"cluster-samples-operator-665b6dd947-zk8cm\" (UID: \"41151abf-171c-435c-ae8c-172e6c55ba6c\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.279325 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-w4ddg" event={"ID":"b61281aa-db40-457c-b669-35a825b77716","Type":"ContainerStarted","Data":"262a53716468b1424b29fed84da58d447d49a99f5915e2ad9c2bbcc952bb3f19"} Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.281920 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:41 crc kubenswrapper[4807]: E1127 11:11:41.282181 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.782148299 +0000 UTC m=+142.881646517 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.282342 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tk46\" (UniqueName: \"kubernetes.io/projected/e38d5152-81eb-46c2-9753-84286838528f-kube-api-access-2tk46\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.282880 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7kpfm" event={"ID":"0a7ac40a-0ecf-482a-9353-e6c71787da7e","Type":"ContainerStarted","Data":"abe91294361aa535ba0dbcd3ecedda88144e6080107373d5ee8516bd18dfe29f"} Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.283377 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:41 crc kubenswrapper[4807]: E1127 11:11:41.283758 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.783742577 +0000 UTC m=+142.883240795 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.285110 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" event={"ID":"49510dea-c289-487e-a43e-9c8c314afd82","Type":"ContainerStarted","Data":"36525fea9a07844423f6f270555cdebb50a411d21a1fae695e77b09fb29c7e30"} Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.285837 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tk46\" (UniqueName: \"kubernetes.io/projected/e38d5152-81eb-46c2-9753-84286838528f-kube-api-access-2tk46\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.286384 4807 patch_prober.go:28] interesting pod/downloads-7954f5f757-jbssf 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.286594 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jbssf" podUID="3daa1b2f-7da1-475f-8807-299bcf8423ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.289872 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q48tw\" (UniqueName: \"kubernetes.io/projected/c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b-kube-api-access-q48tw\") pod \"etcd-operator-b45778765-9jvwr\" (UID: \"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.303796 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svsbs\" (UniqueName: \"kubernetes.io/projected/30c4595c-6d61-4e1a-a92a-db32926bae0b-kube-api-access-svsbs\") pod \"multus-admission-controller-857f4d67dd-lhr72\" (UID: \"30c4595c-6d61-4e1a-a92a-db32926bae0b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lhr72" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.317452 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnh2m\" (UniqueName: \"kubernetes.io/projected/5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d-kube-api-access-bnh2m\") pod \"authentication-operator-69f744f599-m2dvt\" (UID: \"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.332882 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-txc8l\" (UniqueName: \"kubernetes.io/projected/8c1dab92-eafe-4867-a625-c1e7404c7cf0-kube-api-access-txc8l\") pod \"olm-operator-6b444d44fb-vnpgt\" (UID: \"8c1dab92-eafe-4867-a625-c1e7404c7cf0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.370576 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.371770 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjj7h\" (UniqueName: \"kubernetes.io/projected/648ab531-aea8-438e-9a6a-f827594e98b4-kube-api-access-qjj7h\") pod \"route-controller-manager-6576b87f9c-sxp7z\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.377512 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.377802 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-bound-sa-token\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.384636 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:41 crc kubenswrapper[4807]: E1127 11:11:41.384797 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.884773831 +0000 UTC m=+142.984272039 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.384921 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.384974 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.385276 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-client-ca\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.385303 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.385327 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.385367 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-encryption-config\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.385533 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.385568 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-audit-policies\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:41 crc kubenswrapper[4807]: 
I1127 11:11:41.385586 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-serving-cert\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.386138 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.386262 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.386907 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.387847 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.387908 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.387986 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.388036 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-config\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.388060 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-images\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.388082 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.388125 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-audit-policies\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.388162 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.388210 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-client-ca\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.388217 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/168d38aa-a194-4c8e-9717-62f2d5fca760-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.388258 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-image-import-ca\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.388365 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-serving-cert\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.388399 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-auth-proxy-config\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.388638 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-encryption-config\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 
11:11:41.388967 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.388996 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-client\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.389078 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-serving-ca\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.389119 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.389122 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0a86502-013f-4060-91d7-cf5cd9353ccf-auth-proxy-config\") pod \"machine-approver-56656f9798-lqp4t\" (UID: \"a0a86502-013f-4060-91d7-cf5cd9353ccf\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.389136 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-encryption-config\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.389736 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.389775 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-audit-policies\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: E1127 11:11:41.390285 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.890272375 +0000 UTC m=+142.989770573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.390871 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-image-import-ca\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.392862 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.392495 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.394746 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-config\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: 
\"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.395330 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4kzvt\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.396938 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-serving-cert\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.396934 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-encryption-config\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.400448 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-client\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.404373 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4edf81da-33cf-4824-9bac-4926f59ccea1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-n7cfz\" (UID: 
\"4edf81da-33cf-4824-9bac-4926f59ccea1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.404551 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-qsdql\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.418694 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.419680 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt8gd\" (UniqueName: \"kubernetes.io/projected/035df5f8-99c1-4c5c-bb47-87bd5057313e-kube-api-access-vt8gd\") pod \"openshift-controller-manager-operator-756b6f6bc6-gffzl\" (UID: \"035df5f8-99c1-4c5c-bb47-87bd5057313e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.435180 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a14c13d8-b1d3-40c3-84c5-13f3e2060948-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gh6dt\" (UID: \"a14c13d8-b1d3-40c3-84c5-13f3e2060948\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.440157 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.447368 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ldgcp"] Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.450389 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7e163b55-77cb-4055-9d4d-bef2d98d0139-etcd-serving-ca\") pod \"apiserver-76f77b778f-wk6hj\" (UID: \"7e163b55-77cb-4055-9d4d-bef2d98d0139\") " pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.450540 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/168d38aa-a194-4c8e-9717-62f2d5fca760-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zjcm8\" (UID: \"168d38aa-a194-4c8e-9717-62f2d5fca760\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.450847 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.450933 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.450933 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6711e08e-05e5-4111-9e1e-fd5ed988a718-serving-cert\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.451272 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d0fee666-2d95-4330-a8aa-4ab1ca30bb5f-images\") pod \"machine-api-operator-5694c8668f-pvv9r\" (UID: \"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.453119 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.455550 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6711e08e-05e5-4111-9e1e-fd5ed988a718-audit-policies\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.455795 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jdql\" (UniqueName: \"kubernetes.io/projected/8a49f46d-d1b2-4a71-b51f-be562df1fb70-kube-api-access-2jdql\") pod \"catalog-operator-68c6474976-95lnp\" (UID: \"8a49f46d-d1b2-4a71-b51f-be562df1fb70\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.461536 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.467762 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" Nov 27 11:11:41 crc kubenswrapper[4807]: W1127 11:11:41.473122 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedf54281_ae68_49ee_ad29_b47f066f43df.slice/crio-773f76d1cd42ce80c85eacc16cd4387aa537edf8b2952bedcd119470fbeab611 WatchSource:0}: Error finding container 773f76d1cd42ce80c85eacc16cd4387aa537edf8b2952bedcd119470fbeab611: Status 404 returned error can't find the container with id 773f76d1cd42ce80c85eacc16cd4387aa537edf8b2952bedcd119470fbeab611 Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.473807 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.486787 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl6sp\" (UniqueName: \"kubernetes.io/projected/93c49e07-08ef-4b31-abb3-787a46a3fbfd-kube-api-access-gl6sp\") pod \"console-f9d7485db-jdsqc\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.488675 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.490726 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.491046 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hftrd\" (UniqueName: \"kubernetes.io/projected/6711e08e-05e5-4111-9e1e-fd5ed988a718-kube-api-access-hftrd\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:41 crc kubenswrapper[4807]: E1127 11:11:41.491186 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:41.991151646 +0000 UTC m=+143.090649844 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.492422 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.494457 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt6qk\" (UniqueName: \"kubernetes.io/projected/45212442-181e-4e28-a388-e87c6dce9bac-kube-api-access-gt6qk\") pod \"ingress-canary-p9f4b\" (UID: \"45212442-181e-4e28-a388-e87c6dce9bac\") " pod="openshift-ingress-canary/ingress-canary-p9f4b" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.495029 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.502056 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lhr72" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.504275 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.510741 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.513474 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qjn\" (UniqueName: \"kubernetes.io/projected/1689f4c3-757a-4e7c-a8b5-80177a556a0c-kube-api-access-m5qjn\") pod \"service-ca-operator-777779d784-gxsrj\" (UID: \"1689f4c3-757a-4e7c-a8b5-80177a556a0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.541633 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktp87\" (UniqueName: \"kubernetes.io/projected/65e5ef23-71ad-40ae-81bb-e94d9d298087-kube-api-access-ktp87\") pod \"marketplace-operator-79b997595-82ntv\" (UID: \"65e5ef23-71ad-40ae-81bb-e94d9d298087\") " pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.544535 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.544969 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.553010 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.553671 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgrz9\" (UniqueName: \"kubernetes.io/projected/5eeca1a6-a2a9-4c52-badc-a9d78405979c-kube-api-access-xgrz9\") pod \"kube-storage-version-migrator-operator-b67b599dd-v9g2c\" (UID: \"5eeca1a6-a2a9-4c52-badc-a9d78405979c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.563096 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.567033 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.584214 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd94s\" (UniqueName: \"kubernetes.io/projected/d6e0ca81-eb66-426e-b8b0-1bff40087694-kube-api-access-fd94s\") pod \"packageserver-d55dfcdfc-72xkj\" (UID: \"d6e0ca81-eb66-426e-b8b0-1bff40087694\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.590209 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.591759 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:41 crc kubenswrapper[4807]: E1127 11:11:41.592138 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:42.09212616 +0000 UTC m=+143.191624348 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.597162 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv2px\" (UniqueName: \"kubernetes.io/projected/bbba7218-b809-4e7d-ba6e-071ff7261ba2-kube-api-access-wv2px\") pod \"service-ca-9c57cc56f-rqbbk\" (UID: \"bbba7218-b809-4e7d-ba6e-071ff7261ba2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.605566 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.606030 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm"] Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.609878 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.613173 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgr6d\" (UniqueName: \"kubernetes.io/projected/eba22ccb-d6b0-4f36-8583-3a602716a832-kube-api-access-kgr6d\") pod \"dns-default-rqc9w\" (UID: \"eba22ccb-d6b0-4f36-8583-3a602716a832\") " pod="openshift-dns/dns-default-rqc9w" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.620943 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rqc9w" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.621062 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.629056 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hftrd\" (UniqueName: \"kubernetes.io/projected/6711e08e-05e5-4111-9e1e-fd5ed988a718-kube-api-access-hftrd\") pod \"apiserver-7bbb656c7d-hkmnz\" (UID: \"6711e08e-05e5-4111-9e1e-fd5ed988a718\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.629325 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p9f4b" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.635814 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq8gs\" (UniqueName: \"kubernetes.io/projected/acbc9004-f19e-419d-b609-2f9dda223b0d-kube-api-access-dq8gs\") pod \"collect-profiles-29404020-wx78n\" (UID: \"acbc9004-f19e-419d-b609-2f9dda223b0d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.648812 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-kp2d4" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.654873 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkx7t\" (UniqueName: \"kubernetes.io/projected/658b2bd5-04cf-410b-a749-e0e67246ac4c-kube-api-access-hkx7t\") pod \"csi-hostpathplugin-jvchs\" (UID: \"658b2bd5-04cf-410b-a749-e0e67246ac4c\") " pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.672938 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j24jb\" (UniqueName: \"kubernetes.io/projected/8e8444dc-ad7b-41c1-b524-ebd0e453eb46-kube-api-access-j24jb\") pod \"package-server-manager-789f6589d5-2pt99\" (UID: \"8e8444dc-ad7b-41c1-b524-ebd0e453eb46\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.694848 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:41 crc kubenswrapper[4807]: E1127 
11:11:41.695338 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:42.195305549 +0000 UTC m=+143.294803747 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.695777 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4vwl\" (UniqueName: \"kubernetes.io/projected/8a96dd38-5283-4cea-a3d4-623c6a5191a6-kube-api-access-q4vwl\") pod \"control-plane-machine-set-operator-78cbb6b69f-xrg5s\" (UID: \"8a96dd38-5283-4cea-a3d4-623c6a5191a6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.695908 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:41 crc kubenswrapper[4807]: E1127 11:11:41.696203 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-11-27 11:11:42.196192626 +0000 UTC m=+143.295690824 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.709307 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb"] Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.714983 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47wzn\" (UniqueName: \"kubernetes.io/projected/4000ce66-354d-4cac-9da2-16d78c31b056-kube-api-access-47wzn\") pod \"machine-config-server-88rr6\" (UID: \"4000ce66-354d-4cac-9da2-16d78c31b056\") " pod="openshift-machine-config-operator/machine-config-server-88rr6" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.746867 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg75z\" (UniqueName: \"kubernetes.io/projected/e091aded-a48b-4eec-896c-22870c1f216d-kube-api-access-zg75z\") pod \"machine-config-operator-74547568cd-bst7b\" (UID: \"e091aded-a48b-4eec-896c-22870c1f216d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.796805 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:41 crc kubenswrapper[4807]: E1127 11:11:41.797215 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:42.297198991 +0000 UTC m=+143.396697189 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.816552 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz"] Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.819637 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9jvwr"] Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.834514 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.837146 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.867909 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.870737 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z"] Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.872291 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-m2dvt"] Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.878469 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.883927 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.898382 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.898708 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" Nov 27 11:11:41 crc kubenswrapper[4807]: E1127 11:11:41.898832 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:42.398820414 +0000 UTC m=+143.498318612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.912320 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-88rr6" Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.933885 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jdsqc"] Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.945279 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jvchs" Nov 27 11:11:41 crc kubenswrapper[4807]: W1127 11:11:41.968195 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ef37ae7_806a_4c1b_b8b1_a4eefbb26f14.slice/crio-314c4106a406c3c9c008edbccdaab3c274b45058e338efa287c7c7e1887f4bec WatchSource:0}: Error finding container 314c4106a406c3c9c008edbccdaab3c274b45058e338efa287c7c7e1887f4bec: Status 404 returned error can't find the container with id 314c4106a406c3c9c008edbccdaab3c274b45058e338efa287c7c7e1887f4bec Nov 27 11:11:41 crc kubenswrapper[4807]: W1127 11:11:41.983343 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4edf81da_33cf_4824_9bac_4926f59ccea1.slice/crio-280a543f592dba929991c837d9a0aeeef869603496ab2af7185e682e8276e42d WatchSource:0}: Error finding container 280a543f592dba929991c837d9a0aeeef869603496ab2af7185e682e8276e42d: Status 404 returned 
error can't find the container with id 280a543f592dba929991c837d9a0aeeef869603496ab2af7185e682e8276e42d Nov 27 11:11:41 crc kubenswrapper[4807]: I1127 11:11:41.999012 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt"] Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:41.999754 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:42 crc kubenswrapper[4807]: E1127 11:11:42.000564 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:42.50053899 +0000 UTC m=+143.600037198 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:42 crc kubenswrapper[4807]: W1127 11:11:42.020068 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93c49e07_08ef_4b31_abb3_787a46a3fbfd.slice/crio-39a17978965253cb8ff44fcd2baff1b69f2372f7dbd0460d6e9701e781029116 WatchSource:0}: Error finding container 39a17978965253cb8ff44fcd2baff1b69f2372f7dbd0460d6e9701e781029116: Status 404 returned error can't find the container with id 39a17978965253cb8ff44fcd2baff1b69f2372f7dbd0460d6e9701e781029116 Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.026870 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt"] Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.101471 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:42 crc kubenswrapper[4807]: E1127 11:11:42.101928 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:42.601916816 +0000 UTC m=+143.701415014 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.165833 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4kzvt"] Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.196285 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wk6hj"] Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.202958 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:42 crc kubenswrapper[4807]: E1127 11:11:42.203181 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:42.703158087 +0000 UTC m=+143.802656285 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.203320 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:42 crc kubenswrapper[4807]: E1127 11:11:42.203722 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:42.703708084 +0000 UTC m=+143.803206282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.301306 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" event={"ID":"a14c13d8-b1d3-40c3-84c5-13f3e2060948","Type":"ContainerStarted","Data":"516d9e0cc85b89500eb25fcb56d8632d3b86baeaa7c3dfb93e106a28ea09b080"} Nov 27 11:11:42 crc kubenswrapper[4807]: W1127 11:11:42.304942 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode38d5152_81eb_46c2_9753_84286838528f.slice/crio-c0027e9f2fecac93984c1920b260ec0843f99c4c12499013c66c65798f017b15 WatchSource:0}: Error finding container c0027e9f2fecac93984c1920b260ec0843f99c4c12499013c66c65798f017b15: Status 404 returned error can't find the container with id c0027e9f2fecac93984c1920b260ec0843f99c4c12499013c66c65798f017b15 Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.305494 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:42 crc kubenswrapper[4807]: E1127 11:11:42.305986 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:42.805970326 +0000 UTC m=+143.905468514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.310801 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" event={"ID":"4edf81da-33cf-4824-9bac-4926f59ccea1","Type":"ContainerStarted","Data":"280a543f592dba929991c837d9a0aeeef869603496ab2af7185e682e8276e42d"} Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.314120 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jdsqc" event={"ID":"93c49e07-08ef-4b31-abb3-787a46a3fbfd","Type":"ContainerStarted","Data":"39a17978965253cb8ff44fcd2baff1b69f2372f7dbd0460d6e9701e781029116"} Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.316018 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" event={"ID":"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b","Type":"ContainerStarted","Data":"bcd0474079e29b5877ab352b7815006602dead6af6157f08bc8d415e146ec1a8"} Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.319088 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" event={"ID":"8c1dab92-eafe-4867-a625-c1e7404c7cf0","Type":"ContainerStarted","Data":"aa9796351fa6462da8936dd9979fb4abcde14d5c930065b7c7e97266a39a9737"} Nov 27 11:11:42 crc 
kubenswrapper[4807]: I1127 11:11:42.320584 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" event={"ID":"648ab531-aea8-438e-9a6a-f827594e98b4","Type":"ContainerStarted","Data":"4e06ad737ce76c08ad2c3b69ed067c2fb1c4faff4a1811a81755b88418389486"} Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.322258 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ldgcp" event={"ID":"edf54281-ae68-49ee-ad29-b47f066f43df","Type":"ContainerStarted","Data":"773f76d1cd42ce80c85eacc16cd4387aa537edf8b2952bedcd119470fbeab611"} Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.323499 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7kpfm" event={"ID":"0a7ac40a-0ecf-482a-9353-e6c71787da7e","Type":"ContainerStarted","Data":"b14180475ab588955f3776b35192e40d5e718796a727a55ed14a01814d9dff19"} Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.325209 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm" event={"ID":"41151abf-171c-435c-ae8c-172e6c55ba6c","Type":"ContainerStarted","Data":"6697cae73ec35bbf0268b53098e7f46ee46497fb5a14b1b6cff9897f18909cfd"} Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.326378 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" event={"ID":"8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14","Type":"ContainerStarted","Data":"314c4106a406c3c9c008edbccdaab3c274b45058e338efa287c7c7e1887f4bec"} Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.327804 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" 
event={"ID":"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d","Type":"ContainerStarted","Data":"0abd067871df3bdab42da1d3a43654cfb43f30475ca14cdfb269e3028c5608af"} Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.328435 4807 patch_prober.go:28] interesting pod/downloads-7954f5f757-jbssf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.328483 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jbssf" podUID="3daa1b2f-7da1-475f-8807-299bcf8423ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.407630 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:42 crc kubenswrapper[4807]: E1127 11:11:42.410210 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:42.910193097 +0000 UTC m=+144.009691295 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.509293 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:42 crc kubenswrapper[4807]: E1127 11:11:42.509403 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:43.009376297 +0000 UTC m=+144.108874505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.509850 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:42 crc kubenswrapper[4807]: E1127 11:11:42.510310 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:43.010301155 +0000 UTC m=+144.109799353 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.610902 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:42 crc kubenswrapper[4807]: E1127 11:11:42.611226 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:43.111209867 +0000 UTC m=+144.210708065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.711852 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:42 crc kubenswrapper[4807]: E1127 11:11:42.712273 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:43.212238562 +0000 UTC m=+144.311736760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.731195 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-2fmnh" podStartSLOduration=124.730231817 podStartE2EDuration="2m4.730231817s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:42.686598629 +0000 UTC m=+143.786096847" watchObservedRunningTime="2025-11-27 11:11:42.730231817 +0000 UTC m=+143.829730015" Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.812694 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:42 crc kubenswrapper[4807]: E1127 11:11:42.812998 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:43.312984089 +0000 UTC m=+144.412482287 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.849477 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-68g9f" podStartSLOduration=123.849460904 podStartE2EDuration="2m3.849460904s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:42.848607019 +0000 UTC m=+143.948105217" watchObservedRunningTime="2025-11-27 11:11:42.849460904 +0000 UTC m=+143.948959102" Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.849678 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-jbssf" podStartSLOduration=124.849673821 podStartE2EDuration="2m4.849673821s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:42.818761381 +0000 UTC m=+143.918259579" watchObservedRunningTime="2025-11-27 11:11:42.849673821 +0000 UTC m=+143.949172019" Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.908364 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj"] Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.912800 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j"] Nov 27 11:11:42 crc kubenswrapper[4807]: I1127 11:11:42.913512 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:42 crc kubenswrapper[4807]: E1127 11:11:42.913852 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:43.4138413 +0000 UTC m=+144.513339498 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:42 crc kubenswrapper[4807]: W1127 11:11:42.948383 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1689f4c3_757a_4e7c_a8b5_80177a556a0c.slice/crio-969775f9858a415b5180aa4083b49ab331c3b8a737bb199bb12a938e8474598a WatchSource:0}: Error finding container 969775f9858a415b5180aa4083b49ab331c3b8a737bb199bb12a938e8474598a: Status 404 returned error can't find the container with id 969775f9858a415b5180aa4083b49ab331c3b8a737bb199bb12a938e8474598a Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.015107 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:43 crc kubenswrapper[4807]: E1127 11:11:43.015471 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:43.515450362 +0000 UTC m=+144.614948560 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.015543 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:43 crc kubenswrapper[4807]: E1127 11:11:43.015976 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:43.515958238 +0000 UTC m=+144.615456526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.117358 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:43 crc kubenswrapper[4807]: E1127 11:11:43.118173 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:43.618158948 +0000 UTC m=+144.717657146 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.182651 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.202769 4807 patch_prober.go:28] interesting pod/router-default-5444994796-7kpfm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 11:11:43 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Nov 27 11:11:43 crc kubenswrapper[4807]: [+]process-running ok Nov 27 11:11:43 crc kubenswrapper[4807]: healthz check failed Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.202822 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7kpfm" podUID="0a7ac40a-0ecf-482a-9353-e6c71787da7e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.219442 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:43 crc kubenswrapper[4807]: E1127 11:11:43.219727 4807 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:43.719715029 +0000 UTC m=+144.819213227 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.297414 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rqc9w"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.320350 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.320399 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.320978 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:43 crc kubenswrapper[4807]: E1127 11:11:43.321298 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2025-11-27 11:11:43.821284321 +0000 UTC m=+144.920782509 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.325604 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-82ntv"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.327765 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qsdql"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.332324 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.337225 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s"] Nov 27 11:11:43 crc kubenswrapper[4807]: W1127 11:11:43.350651 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a96dd38_5283_4cea_a3d4_623c6a5191a6.slice/crio-162df874683b2d9bf2e8fefa5263b7af5bfae0e5feac41abc37fb511a4d540b5 WatchSource:0}: Error finding container 162df874683b2d9bf2e8fefa5263b7af5bfae0e5feac41abc37fb511a4d540b5: Status 404 returned error can't find the container with id 162df874683b2d9bf2e8fefa5263b7af5bfae0e5feac41abc37fb511a4d540b5 Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.389540 4807 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-w4ddg" podStartSLOduration=125.389525251 podStartE2EDuration="2m5.389525251s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:43.389049297 +0000 UTC m=+144.488547495" watchObservedRunningTime="2025-11-27 11:11:43.389525251 +0000 UTC m=+144.489023449" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.392063 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" event={"ID":"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79","Type":"ContainerStarted","Data":"5d9b1685bdeaff000a28be65b7abec8b54792026ee28538c627f6d7b0db9e64e"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.392094 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" event={"ID":"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79","Type":"ContainerStarted","Data":"f9f2ea696e9f76c3933c99530f2ab78d46b0a5acc360646e6eeb854a7bb17ed8"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.408076 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-kp2d4" podStartSLOduration=125.408058212 podStartE2EDuration="2m5.408058212s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:43.407908768 +0000 UTC m=+144.507406966" watchObservedRunningTime="2025-11-27 11:11:43.408058212 +0000 UTC m=+144.507556410" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.409231 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" 
event={"ID":"e38d5152-81eb-46c2-9753-84286838528f","Type":"ContainerStarted","Data":"fc0fb51bc5f9ece22696dd689c822e107ba5b01e4d5af5140e25d17c86315d59"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.409310 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" event={"ID":"e38d5152-81eb-46c2-9753-84286838528f","Type":"ContainerStarted","Data":"c0027e9f2fecac93984c1920b260ec0843f99c4c12499013c66c65798f017b15"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.409722 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.422935 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:43 crc kubenswrapper[4807]: E1127 11:11:43.423716 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:43.923705048 +0000 UTC m=+145.023203246 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.427941 4807 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4kzvt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.427993 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" podUID="e38d5152-81eb-46c2-9753-84286838528f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.428543 4807 generic.go:334] "Generic (PLEG): container finished" podID="7e163b55-77cb-4055-9d4d-bef2d98d0139" containerID="33e0a59f884c0e0e4f64f87ad57eeb76b95b8ae1b5960da89565ec8685a103c8" exitCode=0 Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.428615 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" event={"ID":"7e163b55-77cb-4055-9d4d-bef2d98d0139","Type":"ContainerDied","Data":"33e0a59f884c0e0e4f64f87ad57eeb76b95b8ae1b5960da89565ec8685a103c8"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.428654 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" 
event={"ID":"7e163b55-77cb-4055-9d4d-bef2d98d0139","Type":"ContainerStarted","Data":"70ad91cd88017a2ac219c1d8f722b7a73becdc9db40a432e75832136606c5a7f"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.452667 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" event={"ID":"a0a86502-013f-4060-91d7-cf5cd9353ccf","Type":"ContainerStarted","Data":"94552639f1fbb922c571ff75b294cfdcff55030ef4f4b345857d6e52522548ce"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.452710 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" event={"ID":"a0a86502-013f-4060-91d7-cf5cd9353ccf","Type":"ContainerStarted","Data":"3c0273180f6b6514e7173d2f27c5c47381f28904aad828f72342d2c7ab7f7f5b"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.483970 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" event={"ID":"a14c13d8-b1d3-40c3-84c5-13f3e2060948","Type":"ContainerStarted","Data":"d77e62c304593e62435ef08c9ad54a984ddab83539208a8b349ea62629c4b1fc"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.516053 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-7kpfm" podStartSLOduration=125.516035904 podStartE2EDuration="2m5.516035904s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:43.501355828 +0000 UTC m=+144.600854036" watchObservedRunningTime="2025-11-27 11:11:43.516035904 +0000 UTC m=+144.615534102" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.518348 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lhr72"] Nov 27 11:11:43 crc 
kubenswrapper[4807]: I1127 11:11:43.522614 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" event={"ID":"648ab531-aea8-438e-9a6a-f827594e98b4","Type":"ContainerStarted","Data":"b69ce7230ed8711a77515a9608bc519b5fe92f85a9bd5ff61d9d6364bd5955aa"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.523996 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.524501 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:43 crc kubenswrapper[4807]: E1127 11:11:43.525345 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:44.025330071 +0000 UTC m=+145.124828269 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.556846 4807 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-sxp7z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.556971 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" podUID="648ab531-aea8-438e-9a6a-f827594e98b4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.562953 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rqbbk"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.563099 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p9f4b"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.563164 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-pvv9r"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.563222 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75"] Nov 27 11:11:43 crc 
kubenswrapper[4807]: I1127 11:11:43.574402 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.575350 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ldgcp" event={"ID":"edf54281-ae68-49ee-ad29-b47f066f43df","Type":"ContainerStarted","Data":"3bb27a6bcd02fb57ccd26c9502e053c4ab14d732c32006fb072932e1451a2691"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.575392 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ldgcp" event={"ID":"edf54281-ae68-49ee-ad29-b47f066f43df","Type":"ContainerStarted","Data":"c5747fbf3eaf1f683e8fd0c90369cec0884ca94fa39e9617c7fda03625d1fb08"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.577632 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" event={"ID":"5dbdfb93-aeb2-4e7a-ab5f-7c9ff0c42b9d","Type":"ContainerStarted","Data":"3eb7f2440b507a33c1f64ed2593ac09c72ae96bcab9085001905ad82f43747af"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.590652 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.594333 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" event={"ID":"c32125c5-e2ef-4ca5-a5d1-1e4bf1fd1b3b","Type":"ContainerStarted","Data":"007d7d21ba51b917bcee0c14c5614ed0b7f287ac25b6184896519d7859c3ce2a"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.596842 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-88rr6" 
event={"ID":"4000ce66-354d-4cac-9da2-16d78c31b056","Type":"ContainerStarted","Data":"fec3424c602e3fac4e3f19c202add849b63bab1c66015dbd7a1a0c04ccbd5f77"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.596868 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-88rr6" event={"ID":"4000ce66-354d-4cac-9da2-16d78c31b056","Type":"ContainerStarted","Data":"47344fb51c1ea216fb399a5a0a7170824e55c36b89eea332be0f03b0eed1ca94"} Nov 27 11:11:43 crc kubenswrapper[4807]: W1127 11:11:43.612143 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30c4595c_6d61_4e1a_a92a_db32926bae0b.slice/crio-23f681a37e3880a0998f27312ae8f47965b61c2984ca8e9b77058cc1d45896fc WatchSource:0}: Error finding container 23f681a37e3880a0998f27312ae8f47965b61c2984ca8e9b77058cc1d45896fc: Status 404 returned error can't find the container with id 23f681a37e3880a0998f27312ae8f47965b61c2984ca8e9b77058cc1d45896fc Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.613272 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm" event={"ID":"41151abf-171c-435c-ae8c-172e6c55ba6c","Type":"ContainerStarted","Data":"bf212b3729cd6dfd6c7de4d3ce6e525dd27fa9479ddeb4e348684e6cd584b30c"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.613301 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm" event={"ID":"41151abf-171c-435c-ae8c-172e6c55ba6c","Type":"ContainerStarted","Data":"e6b15136a0e970fa6228bcb57361688075f035034e8b467e8d9f830e36bd5253"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.625047 4807 generic.go:334] "Generic (PLEG): container finished" podID="8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14" containerID="b50c6bfe30cce456c1d830b78c091e65f299271b3e0277e1ecc2292b9ce07734" exitCode=0 Nov 27 
11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.625106 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" event={"ID":"8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14","Type":"ContainerDied","Data":"b50c6bfe30cce456c1d830b78c091e65f299271b3e0277e1ecc2292b9ce07734"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.625993 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:43 crc kubenswrapper[4807]: E1127 11:11:43.627892 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:44.127876082 +0000 UTC m=+145.227374280 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.636150 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" event={"ID":"1689f4c3-757a-4e7c-a8b5-80177a556a0c","Type":"ContainerStarted","Data":"7bfb4092ef266ce5884a45eb47b6a6adebec22cc109ffa9dcae6fc361ddd5c31"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.636202 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" event={"ID":"1689f4c3-757a-4e7c-a8b5-80177a556a0c","Type":"ContainerStarted","Data":"969775f9858a415b5180aa4083b49ab331c3b8a737bb199bb12a938e8474598a"} Nov 27 11:11:43 crc kubenswrapper[4807]: W1127 11:11:43.647444 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod035df5f8_99c1_4c5c_bb47_87bd5057313e.slice/crio-bfd229033e9e713dc63ebb17580bc0448e68980b2a80f41c05ae9adbe1ac19b6 WatchSource:0}: Error finding container bfd229033e9e713dc63ebb17580bc0448e68980b2a80f41c05ae9adbe1ac19b6: Status 404 returned error can't find the container with id bfd229033e9e713dc63ebb17580bc0448e68980b2a80f41c05ae9adbe1ac19b6 Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.656531 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jdsqc" event={"ID":"93c49e07-08ef-4b31-abb3-787a46a3fbfd","Type":"ContainerStarted","Data":"1507db743a51e61ef8a4e33b3bdaea25a7f492578649121d9f24d7e6a2094899"} 
Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.694110 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gh6dt" podStartSLOduration=124.694090221 podStartE2EDuration="2m4.694090221s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:43.65269231 +0000 UTC m=+144.752190508" watchObservedRunningTime="2025-11-27 11:11:43.694090221 +0000 UTC m=+144.793588409" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.695646 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jvchs"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.700001 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" event={"ID":"8c1dab92-eafe-4867-a625-c1e7404c7cf0","Type":"ContainerStarted","Data":"295b7de7b51846b21d985c729d06713649818ba458df3ef8cf58479ee664ad07"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.700851 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.706773 4807 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-vnpgt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.706819 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" podUID="8c1dab92-eafe-4867-a625-c1e7404c7cf0" containerName="olm-operator" probeResult="failure" 
output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.742694 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" event={"ID":"4edf81da-33cf-4824-9bac-4926f59ccea1","Type":"ContainerStarted","Data":"71550c4d0aeaab41573c40682661076ae06f728405107d6d1c855398b7a37c84"} Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.743053 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.748420 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n"] Nov 27 11:11:43 crc kubenswrapper[4807]: E1127 11:11:43.749387 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:44.249352085 +0000 UTC m=+145.348850293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.762478 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.775299 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz"] Nov 27 11:11:43 crc kubenswrapper[4807]: W1127 11:11:43.777736 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod658b2bd5_04cf_410b_a749_e0e67246ac4c.slice/crio-4439f152df67feb16a3ea02f3d1f004afa75ab0b7e8f24e0e9fe59a5b89b9e94 WatchSource:0}: Error finding container 4439f152df67feb16a3ea02f3d1f004afa75ab0b7e8f24e0e9fe59a5b89b9e94: Status 404 returned error can't find the container with id 4439f152df67feb16a3ea02f3d1f004afa75ab0b7e8f24e0e9fe59a5b89b9e94 Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.780885 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj"] Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.784431 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" podStartSLOduration=125.784411408 podStartE2EDuration="2m5.784411408s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-27 11:11:43.759663142 +0000 UTC m=+144.859161340" watchObservedRunningTime="2025-11-27 11:11:43.784411408 +0000 UTC m=+144.883909606" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.798445 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ldgcp" podStartSLOduration=124.798430635 podStartE2EDuration="2m4.798430635s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:43.798089485 +0000 UTC m=+144.897587683" watchObservedRunningTime="2025-11-27 11:11:43.798430635 +0000 UTC m=+144.897928833" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.829514 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" podStartSLOduration=124.829480369 podStartE2EDuration="2m4.829480369s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:43.826929483 +0000 UTC m=+144.926427691" watchObservedRunningTime="2025-11-27 11:11:43.829480369 +0000 UTC m=+144.928978567" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.853837 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:43 crc kubenswrapper[4807]: E1127 11:11:43.854748 4807 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:44.35472298 +0000 UTC m=+145.454221168 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.856281 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-9jvwr" podStartSLOduration=125.856270366 podStartE2EDuration="2m5.856270366s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:43.855845343 +0000 UTC m=+144.955343541" watchObservedRunningTime="2025-11-27 11:11:43.856270366 +0000 UTC m=+144.955768564" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.908934 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-m2dvt" podStartSLOduration=125.908903082 podStartE2EDuration="2m5.908903082s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:43.899280526 +0000 UTC m=+144.998778724" watchObservedRunningTime="2025-11-27 11:11:43.908903082 +0000 UTC m=+145.008401280" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.954869 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:43 crc kubenswrapper[4807]: E1127 11:11:43.955152 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:44.455137817 +0000 UTC m=+145.554636015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.981596 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-88rr6" podStartSLOduration=6.981580584 podStartE2EDuration="6.981580584s" podCreationTimestamp="2025-11-27 11:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:43.933624447 +0000 UTC m=+145.033122645" watchObservedRunningTime="2025-11-27 11:11:43.981580584 +0000 UTC m=+145.081078782" Nov 27 11:11:43 crc kubenswrapper[4807]: I1127 11:11:43.982025 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-n7cfz" podStartSLOduration=125.982018857 
podStartE2EDuration="2m5.982018857s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:43.981366348 +0000 UTC m=+145.080864556" watchObservedRunningTime="2025-11-27 11:11:43.982018857 +0000 UTC m=+145.081517055" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.022733 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zk8cm" podStartSLOduration=126.022712138 podStartE2EDuration="2m6.022712138s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.015644877 +0000 UTC m=+145.115143075" watchObservedRunningTime="2025-11-27 11:11:44.022712138 +0000 UTC m=+145.122210336" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.055804 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gxsrj" podStartSLOduration=125.055786371 podStartE2EDuration="2m5.055786371s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.053972478 +0000 UTC m=+145.153470676" watchObservedRunningTime="2025-11-27 11:11:44.055786371 +0000 UTC m=+145.155284569" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.063590 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:44 crc kubenswrapper[4807]: E1127 11:11:44.064340 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:44.564327286 +0000 UTC m=+145.663825484 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.097725 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-jdsqc" podStartSLOduration=126.097711829 podStartE2EDuration="2m6.097711829s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.095625677 +0000 UTC m=+145.195123865" watchObservedRunningTime="2025-11-27 11:11:44.097711829 +0000 UTC m=+145.197210027" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.165673 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:44 crc kubenswrapper[4807]: E1127 11:11:44.166706 4807 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:44.666691851 +0000 UTC m=+145.766190049 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.171718 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" podStartSLOduration=125.17170711 podStartE2EDuration="2m5.17170711s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.17135694 +0000 UTC m=+145.270855138" watchObservedRunningTime="2025-11-27 11:11:44.17170711 +0000 UTC m=+145.271205308" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.189401 4807 patch_prober.go:28] interesting pod/router-default-5444994796-7kpfm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 11:11:44 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Nov 27 11:11:44 crc kubenswrapper[4807]: [+]process-running ok Nov 27 11:11:44 crc kubenswrapper[4807]: healthz check failed Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.189444 4807 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-7kpfm" podUID="0a7ac40a-0ecf-482a-9353-e6c71787da7e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.268239 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:44 crc kubenswrapper[4807]: E1127 11:11:44.268832 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:44.768819919 +0000 UTC m=+145.868318117 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.372751 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:44 crc kubenswrapper[4807]: E1127 11:11:44.373081 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:44.87306532 +0000 UTC m=+145.972563518 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.474208 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:44 crc kubenswrapper[4807]: E1127 11:11:44.474618 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:44.974603231 +0000 UTC m=+146.074101429 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.576679 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:44 crc kubenswrapper[4807]: E1127 11:11:44.576983 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.076970176 +0000 UTC m=+146.176468374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.678152 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:44 crc kubenswrapper[4807]: E1127 11:11:44.678505 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.178494107 +0000 UTC m=+146.277992305 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.758897 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p9f4b" event={"ID":"45212442-181e-4e28-a388-e87c6dce9bac","Type":"ContainerStarted","Data":"525e373efea7c631cf0a4cbb5ab68771197fc03cd1145b2b0fc1918b355f61ae"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.758937 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p9f4b" event={"ID":"45212442-181e-4e28-a388-e87c6dce9bac","Type":"ContainerStarted","Data":"60544c3165e48fd5811bcd11e3c61966cdee4d56428ccb46204bd0bc9625734f"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.773657 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p9f4b" podStartSLOduration=7.773634847 podStartE2EDuration="7.773634847s" podCreationTimestamp="2025-11-27 11:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.770627107 +0000 UTC m=+145.870125305" watchObservedRunningTime="2025-11-27 11:11:44.773634847 +0000 UTC m=+145.873133045" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.773700 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" 
event={"ID":"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f","Type":"ContainerStarted","Data":"e290c6bc63e844321153c8fb76af9993307ef802c3d47aa2fb7ea184738ea83a"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.773747 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" event={"ID":"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f","Type":"ContainerStarted","Data":"961a157e9a540af51c07f2484b74ca3276bf69d3c5219f878a894cd01cf0746a"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.777633 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" event={"ID":"8e8444dc-ad7b-41c1-b524-ebd0e453eb46","Type":"ContainerStarted","Data":"934ca4a33968a0e3f8b06818a673d01f9929a699108f6b9dccf720f48ee7a85d"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.777677 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" event={"ID":"8e8444dc-ad7b-41c1-b524-ebd0e453eb46","Type":"ContainerStarted","Data":"6c2d1fa0e0b5ede938387ab9666ec8813b804c2796752a7b62d9cf9727a3b117"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.779625 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:44 crc kubenswrapper[4807]: E1127 11:11:44.779964 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.279951075 +0000 UTC m=+146.379449273 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.780618 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" event={"ID":"b8224b35-5290-4943-987c-3101d802b811","Type":"ContainerStarted","Data":"f373b612db0020d548e761456ffb246a9c2267bcc9e770d8386d05c073a35ece"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.780651 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" event={"ID":"b8224b35-5290-4943-987c-3101d802b811","Type":"ContainerStarted","Data":"b2809ec3d653b998f6e7eff098abef1c8572b262b2b2e7b4c436fb9044bd74cd"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.780666 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" event={"ID":"b8224b35-5290-4943-987c-3101d802b811","Type":"ContainerStarted","Data":"64800870c11a6bf225363e8ad4ffe0eb3afed5f1547e192280485dd1de0635b1"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.786000 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rqc9w" event={"ID":"eba22ccb-d6b0-4f36-8583-3a602716a832","Type":"ContainerStarted","Data":"d1a9d098fc3fec033a528335b5c9f8a5b71b3b7b89e6d5a63deb7a16afe577e3"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.786042 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rqc9w" 
event={"ID":"eba22ccb-d6b0-4f36-8583-3a602716a832","Type":"ContainerStarted","Data":"f6a74fe1fc0686d10cb271b82d4cfea0ed8eb7efc2a2b0755930fd12c4d2d28e"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.787929 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" event={"ID":"85f17fc8-a7a8-4fdf-88d2-7f9471b61c79","Type":"ContainerStarted","Data":"2b35a6d52d8199b4b79e596af3c85314e3aa96fb37eaed558435f53cba20951a"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.789741 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lhr72" event={"ID":"30c4595c-6d61-4e1a-a92a-db32926bae0b","Type":"ContainerStarted","Data":"dc27272c42749497e8fbf6a82e91750ad178247dfe1747c71bd7ccb5f1c93dac"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.789780 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lhr72" event={"ID":"30c4595c-6d61-4e1a-a92a-db32926bae0b","Type":"ContainerStarted","Data":"23f681a37e3880a0998f27312ae8f47965b61c2984ca8e9b77058cc1d45896fc"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.792320 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" event={"ID":"65e5ef23-71ad-40ae-81bb-e94d9d298087","Type":"ContainerStarted","Data":"6617f5552a5be9cb1d47e1efef730cfc812a511591b28d13d051ea011529efd3"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.792345 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" event={"ID":"65e5ef23-71ad-40ae-81bb-e94d9d298087","Type":"ContainerStarted","Data":"b1faebbc18104f2082a1daaa24e48f248b3a7b7f5630a4ee275b60eda9baf4d1"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.792523 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.793400 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jvchs" event={"ID":"658b2bd5-04cf-410b-a749-e0e67246ac4c","Type":"ContainerStarted","Data":"4439f152df67feb16a3ea02f3d1f004afa75ab0b7e8f24e0e9fe59a5b89b9e94"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.794275 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" event={"ID":"bbba7218-b809-4e7d-ba6e-071ff7261ba2","Type":"ContainerStarted","Data":"3485b8624a7d3ee3e7ff38f0a0134b7761afe1e729ae7d2ab12144e1d840f161"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.794296 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" event={"ID":"bbba7218-b809-4e7d-ba6e-071ff7261ba2","Type":"ContainerStarted","Data":"56ab58a2500fe264544118383dd296810380fc5860e46f68fbd47e4eb8a85b02"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.794792 4807 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-82ntv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.794823 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.798116 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" 
event={"ID":"8a49f46d-d1b2-4a71-b51f-be562df1fb70","Type":"ContainerStarted","Data":"387fa281eb0e56c933bc6a3d4b916c2b36e1db5af909dff36a1adae9610d9aef"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.798162 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" event={"ID":"8a49f46d-d1b2-4a71-b51f-be562df1fb70","Type":"ContainerStarted","Data":"d05d1b9d1c958ffffea07e66985765a12b37f173eb9f233a96a9ff5f05dd5de7"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.798779 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.800436 4807 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-95lnp container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.800490 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" podUID="8a49f46d-d1b2-4a71-b51f-be562df1fb70" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.802458 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" event={"ID":"168d38aa-a194-4c8e-9717-62f2d5fca760","Type":"ContainerStarted","Data":"70a8eb8ed183dda82c36ca1cf069f5f3987121a39bbcf4a6c8f3f53a6b964c62"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.802489 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" event={"ID":"168d38aa-a194-4c8e-9717-62f2d5fca760","Type":"ContainerStarted","Data":"33e26a73b33d9f7280975e6ff7c84017cb746db7aef5c5435c303036185e8d50"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.805532 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" event={"ID":"7e163b55-77cb-4055-9d4d-bef2d98d0139","Type":"ContainerStarted","Data":"c7f47c08dadbd1bac532c5ecfa6b9befa6cec62835c7f49a79e95ea850d73c08"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.808689 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" event={"ID":"035df5f8-99c1-4c5c-bb47-87bd5057313e","Type":"ContainerStarted","Data":"22ce346776f7882bb5e321ef7f64947e7523868b45cdf54bc1889d592eb2a729"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.808728 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" event={"ID":"035df5f8-99c1-4c5c-bb47-87bd5057313e","Type":"ContainerStarted","Data":"bfd229033e9e713dc63ebb17580bc0448e68980b2a80f41c05ae9adbe1ac19b6"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.810510 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" event={"ID":"a0a86502-013f-4060-91d7-cf5cd9353ccf","Type":"ContainerStarted","Data":"bc5cf55a96490b8d900b788bba7aaaa8c8f219b3ef80bd044e46c7d2f13521f8"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.813041 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" event={"ID":"d6e0ca81-eb66-426e-b8b0-1bff40087694","Type":"ContainerStarted","Data":"7a455a6512202f5f98886897e936e00fb9e1e62ce329abe7cca230958d00d9e7"} Nov 27 11:11:44 crc 
kubenswrapper[4807]: I1127 11:11:44.813066 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" event={"ID":"d6e0ca81-eb66-426e-b8b0-1bff40087694","Type":"ContainerStarted","Data":"1a7d2dccadfa8dbb096079781ce0eb0df8f9e51b353bf367e3252b29146648f1"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.813266 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.816334 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" event={"ID":"acbc9004-f19e-419d-b609-2f9dda223b0d","Type":"ContainerStarted","Data":"070b8fbc8145631b85d09fdc49b50fcf20feaa04047883449ef385dc134ab75e"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.816367 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" event={"ID":"acbc9004-f19e-419d-b609-2f9dda223b0d","Type":"ContainerStarted","Data":"4803e5ec4a08da5fc2c4b304155ef2a849fca47f1c5fcd31e30c31f881e57487"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.818472 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" event={"ID":"e091aded-a48b-4eec-896c-22870c1f216d","Type":"ContainerStarted","Data":"add68cdb07a263e7ee48de4ad4f7dcc56bc0ffe81d5d7388955edd775b54da2e"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.818505 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" event={"ID":"e091aded-a48b-4eec-896c-22870c1f216d","Type":"ContainerStarted","Data":"f8f2f2bcd06b3a0918dbce5d147794ad35402e348ba60cf88d59ba3bf0ba248c"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.821874 4807 
patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-72xkj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body= Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.821937 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" podUID="d6e0ca81-eb66-426e-b8b0-1bff40087694" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.823793 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-ddd75" podStartSLOduration=125.823772008 podStartE2EDuration="2m5.823772008s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.817702178 +0000 UTC m=+145.917200416" watchObservedRunningTime="2025-11-27 11:11:44.823772008 +0000 UTC m=+145.923270206" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.835284 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" event={"ID":"7462250b-699f-4fff-9600-8dff49efc2e8","Type":"ContainerStarted","Data":"ef36d13432a2b1dcb6d2cbce900931c0cfdef30f63cadd500d7c9d45f95feabc"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.835324 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" event={"ID":"7462250b-699f-4fff-9600-8dff49efc2e8","Type":"ContainerStarted","Data":"c132e74c9ecf1bcfe0c0fb26a4ab006461b2e984089a0ba2a3d919581a21ba32"} Nov 27 11:11:44 crc 
kubenswrapper[4807]: I1127 11:11:44.835566 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.839384 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" event={"ID":"8ef37ae7-806a-4c1b-b8b1-a4eefbb26f14","Type":"ContainerStarted","Data":"53ec89c536272b1c418ac9d1fabe86efe55e0881b58ef510170ca3dcacd62843"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.839648 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.841851 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" event={"ID":"5eeca1a6-a2a9-4c52-badc-a9d78405979c","Type":"ContainerStarted","Data":"1d55f8a97b0a9cdc57b9e0fe8fceecd1bed2769f9264c9e77760cba7b2ebe5e9"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.841892 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" event={"ID":"5eeca1a6-a2a9-4c52-badc-a9d78405979c","Type":"ContainerStarted","Data":"8b8e8c77c9565cbbdad2050cba2ecaf28fcc50c661953995fa8006db84221a46"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.844893 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gffzl" podStartSLOduration=126.844874026 podStartE2EDuration="2m6.844874026s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 
11:11:44.843682851 +0000 UTC m=+145.943181049" watchObservedRunningTime="2025-11-27 11:11:44.844874026 +0000 UTC m=+145.944372224" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.846554 4807 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-qsdql container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" start-of-body= Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.846606 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" podUID="7462250b-699f-4fff-9600-8dff49efc2e8" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.848607 4807 generic.go:334] "Generic (PLEG): container finished" podID="6711e08e-05e5-4111-9e1e-fd5ed988a718" containerID="510da0e664d63764f0080bf6cbb63a46f487d4fb6123cc91e1fa1c5b0dd79d62" exitCode=0 Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.848691 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" event={"ID":"6711e08e-05e5-4111-9e1e-fd5ed988a718","Type":"ContainerDied","Data":"510da0e664d63764f0080bf6cbb63a46f487d4fb6123cc91e1fa1c5b0dd79d62"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.848717 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" event={"ID":"6711e08e-05e5-4111-9e1e-fd5ed988a718","Type":"ContainerStarted","Data":"111362b22cdbf9a0b202806cf8e523c77d46068bdb26866b74587ac2280f678d"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.851130 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s" 
event={"ID":"8a96dd38-5283-4cea-a3d4-623c6a5191a6","Type":"ContainerStarted","Data":"eae2323e98b6df102f7cc030df8199c4b7a0c5e96de1bf23b2d0fcd1ef0db8e6"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.851150 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s" event={"ID":"8a96dd38-5283-4cea-a3d4-623c6a5191a6","Type":"ContainerStarted","Data":"162df874683b2d9bf2e8fefa5263b7af5bfae0e5feac41abc37fb511a4d540b5"} Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.860998 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-vnpgt" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.864064 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.864642 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lqp4t" podStartSLOduration=126.864620314 podStartE2EDuration="2m6.864620314s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.860330886 +0000 UTC m=+145.959829084" watchObservedRunningTime="2025-11-27 11:11:44.864620314 +0000 UTC m=+145.964118502" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.882636 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 
11:11:44 crc kubenswrapper[4807]: E1127 11:11:44.883142 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.383129783 +0000 UTC m=+146.482627981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.903386 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-rqbbk" podStartSLOduration=125.903366835 podStartE2EDuration="2m5.903366835s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.881179125 +0000 UTC m=+145.980677323" watchObservedRunningTime="2025-11-27 11:11:44.903366835 +0000 UTC m=+146.002865033" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.906942 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.932591 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vmm8j" podStartSLOduration=126.932572474 podStartE2EDuration="2m6.932572474s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.905084276 +0000 UTC m=+146.004582474" watchObservedRunningTime="2025-11-27 11:11:44.932572474 +0000 UTC m=+146.032070672" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.950580 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" podStartSLOduration=126.950563629 podStartE2EDuration="2m6.950563629s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.933386898 +0000 UTC m=+146.032885096" watchObservedRunningTime="2025-11-27 11:11:44.950563629 +0000 UTC m=+146.050061827" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.951505 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" podStartSLOduration=125.951501207 podStartE2EDuration="2m5.951501207s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.949187598 +0000 UTC m=+146.048685796" watchObservedRunningTime="2025-11-27 11:11:44.951501207 +0000 UTC m=+146.050999405" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.971953 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" podStartSLOduration=125.971940715 podStartE2EDuration="2m5.971940715s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.969957016 +0000 UTC m=+146.069455214" watchObservedRunningTime="2025-11-27 11:11:44.971940715 +0000 UTC 
m=+146.071438913" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.981689 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" podStartSLOduration=125.981673615 podStartE2EDuration="2m5.981673615s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.981638234 +0000 UTC m=+146.081136432" watchObservedRunningTime="2025-11-27 11:11:44.981673615 +0000 UTC m=+146.081171813" Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.983962 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:44 crc kubenswrapper[4807]: E1127 11:11:44.984048 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.484034395 +0000 UTC m=+146.583532593 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:44 crc kubenswrapper[4807]: I1127 11:11:44.985387 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:44 crc kubenswrapper[4807]: E1127 11:11:44.986590 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.486573381 +0000 UTC m=+146.586071579 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.012575 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zjcm8" podStartSLOduration=127.012557504 podStartE2EDuration="2m7.012557504s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:44.996787244 +0000 UTC m=+146.096285442" watchObservedRunningTime="2025-11-27 11:11:45.012557504 +0000 UTC m=+146.112055702" Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.042473 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v9g2c" podStartSLOduration=126.042460853 podStartE2EDuration="2m6.042460853s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:45.041574607 +0000 UTC m=+146.141072805" watchObservedRunningTime="2025-11-27 11:11:45.042460853 +0000 UTC m=+146.141959051" Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.090542 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.090711 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.590685518 +0000 UTC m=+146.690183716 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.090843 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" podStartSLOduration=127.090819932 podStartE2EDuration="2m7.090819932s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:45.074257239 +0000 UTC m=+146.173755437" watchObservedRunningTime="2025-11-27 11:11:45.090819932 +0000 UTC m=+146.190318130" Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.090893 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 
27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.091150 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.591137401 +0000 UTC m=+146.690635599 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.104828 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xrg5s" podStartSLOduration=126.104809838 podStartE2EDuration="2m6.104809838s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:45.103138918 +0000 UTC m=+146.202637116" watchObservedRunningTime="2025-11-27 11:11:45.104809838 +0000 UTC m=+146.204308036" Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.136423 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" podStartSLOduration=127.136405828 podStartE2EDuration="2m7.136405828s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:45.135341836 +0000 UTC m=+146.234840034" watchObservedRunningTime="2025-11-27 11:11:45.136405828 +0000 UTC 
m=+146.235904026" Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.187769 4807 patch_prober.go:28] interesting pod/router-default-5444994796-7kpfm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 11:11:45 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Nov 27 11:11:45 crc kubenswrapper[4807]: [+]process-running ok Nov 27 11:11:45 crc kubenswrapper[4807]: healthz check failed Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.187834 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7kpfm" podUID="0a7ac40a-0ecf-482a-9353-e6c71787da7e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.192331 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.192649 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.692632991 +0000 UTC m=+146.792131189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.192793 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.193043 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.693035283 +0000 UTC m=+146.792533481 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.293791 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.293889 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.793874493 +0000 UTC m=+146.893372691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.294142 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.294533 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.794514562 +0000 UTC m=+146.894012760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.395449 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.395620 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.895596719 +0000 UTC m=+146.995094917 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.395705 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.396125 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.896110694 +0000 UTC m=+146.995608892 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.498157 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.498318 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.998296484 +0000 UTC m=+147.097794682 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.498519 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.498801 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:45.998788159 +0000 UTC m=+147.098286357 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.599606 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.600196 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.100182415 +0000 UTC m=+147.199680613 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.701551 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.701895 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.20188218 +0000 UTC m=+147.301380378 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.802196 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.802535 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.302518204 +0000 UTC m=+147.402016402 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.858438 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" event={"ID":"7e163b55-77cb-4055-9d4d-bef2d98d0139","Type":"ContainerStarted","Data":"3d6214dae972e74c2889d5eec9b2f1bdb3d7e6660f5427421ab7be924f366c11"} Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.860816 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lhr72" event={"ID":"30c4595c-6d61-4e1a-a92a-db32926bae0b","Type":"ContainerStarted","Data":"4e45827b2765c9d991a50f7c376b26e039c111989c91200dfdcb71136de0f399"} Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.862891 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" event={"ID":"e091aded-a48b-4eec-896c-22870c1f216d","Type":"ContainerStarted","Data":"7d45d4b390ce277ce8ed1ca56aa50d30f3c3860c5131fb1ee3a90e902aecb1c6"} Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.865268 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" event={"ID":"d0fee666-2d95-4330-a8aa-4ab1ca30bb5f","Type":"ContainerStarted","Data":"3cccc363d371bd60c7e0692d35cf1735b3cb219dd6557d7a512208e902225299"} Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.867774 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rqc9w" 
event={"ID":"eba22ccb-d6b0-4f36-8583-3a602716a832","Type":"ContainerStarted","Data":"003072fb5602979cbce3d6980c9f733d5690b7be81ff44708e2933108bd6364d"} Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.868125 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-rqc9w" Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.869672 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" event={"ID":"8e8444dc-ad7b-41c1-b524-ebd0e453eb46","Type":"ContainerStarted","Data":"58b8de4069eccdb0e3e32439f58140df5f05395609e272af0d98b77ab9c93898"} Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.870203 4807 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-qsdql container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" start-of-body= Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.870256 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" podUID="7462250b-699f-4fff-9600-8dff49efc2e8" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.871664 4807 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-72xkj container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body= Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.871691 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" 
podUID="d6e0ca81-eb66-426e-b8b0-1bff40087694" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.871750 4807 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-82ntv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.871800 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.872842 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.892962 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-95lnp" Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.903788 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:45 crc kubenswrapper[4807]: E1127 11:11:45.905384 4807 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.405373104 +0000 UTC m=+147.504871302 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.947222 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" podStartSLOduration=127.94720576899999 podStartE2EDuration="2m7.947205769s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:45.946612211 +0000 UTC m=+147.046110419" watchObservedRunningTime="2025-11-27 11:11:45.947205769 +0000 UTC m=+147.046703967" Nov 27 11:11:45 crc kubenswrapper[4807]: I1127 11:11:45.982975 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rqc9w" podStartSLOduration=8.982957542 podStartE2EDuration="8.982957542s" podCreationTimestamp="2025-11-27 11:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:45.976308394 +0000 UTC m=+147.075806592" watchObservedRunningTime="2025-11-27 11:11:45.982957542 +0000 UTC m=+147.082455730" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.004267 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bst7b" podStartSLOduration=127.004253006 podStartE2EDuration="2m7.004253006s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:45.999941187 +0000 UTC m=+147.099439385" watchObservedRunningTime="2025-11-27 11:11:46.004253006 +0000 UTC m=+147.103751214" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.004746 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:46 crc kubenswrapper[4807]: E1127 11:11:46.005934 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.505893985 +0000 UTC m=+147.605392233 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.106572 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:46 crc kubenswrapper[4807]: E1127 11:11:46.106948 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.60693149 +0000 UTC m=+147.706429688 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.169354 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-lhr72" podStartSLOduration=127.169337317 podStartE2EDuration="2m7.169337317s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:46.136468269 +0000 UTC m=+147.235966467" watchObservedRunningTime="2025-11-27 11:11:46.169337317 +0000 UTC m=+147.268835515" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.187035 4807 patch_prober.go:28] interesting pod/router-default-5444994796-7kpfm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 11:11:46 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Nov 27 11:11:46 crc kubenswrapper[4807]: [+]process-running ok Nov 27 11:11:46 crc kubenswrapper[4807]: healthz check failed Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.187088 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7kpfm" podUID="0a7ac40a-0ecf-482a-9353-e6c71787da7e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.208892 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:46 crc kubenswrapper[4807]: E1127 11:11:46.209209 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.709194453 +0000 UTC m=+147.808692651 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.233109 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-pvv9r" podStartSLOduration=127.233092464 podStartE2EDuration="2m7.233092464s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:46.23095139 +0000 UTC m=+147.330449588" watchObservedRunningTime="2025-11-27 11:11:46.233092464 +0000 UTC m=+147.332590662" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.234450 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" podStartSLOduration=127.234445304 
podStartE2EDuration="2m7.234445304s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:46.195901137 +0000 UTC m=+147.295399335" watchObservedRunningTime="2025-11-27 11:11:46.234445304 +0000 UTC m=+147.333943502" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.311496 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:46 crc kubenswrapper[4807]: E1127 11:11:46.311903 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.811887558 +0000 UTC m=+147.911385756 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.412998 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:46 crc kubenswrapper[4807]: E1127 11:11:46.413328 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:46.913292424 +0000 UTC m=+148.012790622 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.514451 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:46 crc kubenswrapper[4807]: E1127 11:11:46.514738 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:47.014725382 +0000 UTC m=+148.114223580 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.546290 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.546409 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.548187 4807 patch_prober.go:28] interesting pod/apiserver-76f77b778f-wk6hj container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.548237 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" podUID="7e163b55-77cb-4055-9d4d-bef2d98d0139" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.615552 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:46 crc kubenswrapper[4807]: E1127 
11:11:46.615769 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:47.115747147 +0000 UTC m=+148.215245345 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.616028 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:46 crc kubenswrapper[4807]: E1127 11:11:46.616339 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:47.116326235 +0000 UTC m=+148.215824433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.717221 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:46 crc kubenswrapper[4807]: E1127 11:11:46.717653 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:47.217635198 +0000 UTC m=+148.317133396 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.818573 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:46 crc kubenswrapper[4807]: E1127 11:11:46.818919 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:47.318907781 +0000 UTC m=+148.418405969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.877611 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" event={"ID":"6711e08e-05e5-4111-9e1e-fd5ed988a718","Type":"ContainerStarted","Data":"26354ff20092c1bf99cd1ee5ffc06bc001073aa14102ec60ec43af0eed6687e8"} Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.879028 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jvchs" event={"ID":"658b2bd5-04cf-410b-a749-e0e67246ac4c","Type":"ContainerStarted","Data":"fd7b474beebb85db29fbef9844a4efd4b0027733686afbff6fb79381233ef334"} Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.919336 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:46 crc kubenswrapper[4807]: E1127 11:11:46.919482 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:47.419458742 +0000 UTC m=+148.518956940 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.919787 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.919979 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.920278 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:46 crc kubenswrapper[4807]: E1127 11:11:46.920906 4807 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:47.420890015 +0000 UTC m=+148.520388213 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.921436 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.924653 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" podStartSLOduration=127.924641237 podStartE2EDuration="2m7.924641237s" podCreationTimestamp="2025-11-27 11:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:46.919275197 +0000 UTC m=+148.018773395" watchObservedRunningTime="2025-11-27 11:11:46.924641237 +0000 UTC m=+148.024139435" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.928606 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.961838 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4zn8d"] Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.962764 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.969478 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 27 11:11:46 crc kubenswrapper[4807]: I1127 11:11:46.977438 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4zn8d"] Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.021192 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:47 crc kubenswrapper[4807]: E1127 11:11:47.021420 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:47.521396325 +0000 UTC m=+148.620894523 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.021457 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.021543 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7984b376-029e-465e-893e-f62f047ee418-catalog-content\") pod \"community-operators-4zn8d\" (UID: \"7984b376-029e-465e-893e-f62f047ee418\") " pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.021575 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.021627 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg42b\" (UniqueName: 
\"kubernetes.io/projected/7984b376-029e-465e-893e-f62f047ee418-kube-api-access-jg42b\") pod \"community-operators-4zn8d\" (UID: \"7984b376-029e-465e-893e-f62f047ee418\") " pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.021668 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.021714 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7984b376-029e-465e-893e-f62f047ee418-utilities\") pod \"community-operators-4zn8d\" (UID: \"7984b376-029e-465e-893e-f62f047ee418\") " pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:11:47 crc kubenswrapper[4807]: E1127 11:11:47.022206 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:47.522193439 +0000 UTC m=+148.621691637 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.024820 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.028762 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.122445 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:47 crc kubenswrapper[4807]: E1127 11:11:47.122638 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 11:11:47.622613816 +0000 UTC m=+148.722112014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.122709 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.122749 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7984b376-029e-465e-893e-f62f047ee418-utilities\") pod \"community-operators-4zn8d\" (UID: \"7984b376-029e-465e-893e-f62f047ee418\") " pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.122806 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7984b376-029e-465e-893e-f62f047ee418-catalog-content\") pod \"community-operators-4zn8d\" (UID: \"7984b376-029e-465e-893e-f62f047ee418\") " pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.122838 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg42b\" (UniqueName: 
\"kubernetes.io/projected/7984b376-029e-465e-893e-f62f047ee418-kube-api-access-jg42b\") pod \"community-operators-4zn8d\" (UID: \"7984b376-029e-465e-893e-f62f047ee418\") " pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:11:47 crc kubenswrapper[4807]: E1127 11:11:47.123058 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:47.623050869 +0000 UTC m=+148.722549067 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.123331 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7984b376-029e-465e-893e-f62f047ee418-catalog-content\") pod \"community-operators-4zn8d\" (UID: \"7984b376-029e-465e-893e-f62f047ee418\") " pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.123593 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7984b376-029e-465e-893e-f62f047ee418-utilities\") pod \"community-operators-4zn8d\" (UID: \"7984b376-029e-465e-893e-f62f047ee418\") " pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.138471 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg42b\" (UniqueName: 
\"kubernetes.io/projected/7984b376-029e-465e-893e-f62f047ee418-kube-api-access-jg42b\") pod \"community-operators-4zn8d\" (UID: \"7984b376-029e-465e-893e-f62f047ee418\") " pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.146641 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.161457 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.170412 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.188375 4807 patch_prober.go:28] interesting pod/router-default-5444994796-7kpfm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 11:11:47 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Nov 27 11:11:47 crc kubenswrapper[4807]: [+]process-running ok Nov 27 11:11:47 crc kubenswrapper[4807]: healthz check failed Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.188422 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7kpfm" podUID="0a7ac40a-0ecf-482a-9353-e6c71787da7e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.223787 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:47 crc kubenswrapper[4807]: E1127 11:11:47.224356 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:47.724341352 +0000 UTC m=+148.823839550 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.242848 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-72xkj" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.279554 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.325562 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:47 crc kubenswrapper[4807]: E1127 11:11:47.325868 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:47.825857342 +0000 UTC m=+148.925355541 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.326693 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tmwd8"] Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.327594 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.347843 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmwd8"] Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.427984 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.428235 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fac2baf-01de-4e80-8434-4488846fd7fb-utilities\") pod \"community-operators-tmwd8\" (UID: \"6fac2baf-01de-4e80-8434-4488846fd7fb\") " pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.428345 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fac2baf-01de-4e80-8434-4488846fd7fb-catalog-content\") pod \"community-operators-tmwd8\" (UID: \"6fac2baf-01de-4e80-8434-4488846fd7fb\") " pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.428365 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qxg4\" (UniqueName: \"kubernetes.io/projected/6fac2baf-01de-4e80-8434-4488846fd7fb-kube-api-access-5qxg4\") pod \"community-operators-tmwd8\" (UID: \"6fac2baf-01de-4e80-8434-4488846fd7fb\") " pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:11:47 crc kubenswrapper[4807]: E1127 11:11:47.428450 4807 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:47.928436204 +0000 UTC m=+149.027934402 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.535824 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.535869 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fac2baf-01de-4e80-8434-4488846fd7fb-catalog-content\") pod \"community-operators-tmwd8\" (UID: \"6fac2baf-01de-4e80-8434-4488846fd7fb\") " pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.535897 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qxg4\" (UniqueName: \"kubernetes.io/projected/6fac2baf-01de-4e80-8434-4488846fd7fb-kube-api-access-5qxg4\") pod \"community-operators-tmwd8\" (UID: \"6fac2baf-01de-4e80-8434-4488846fd7fb\") " 
pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.535920 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fac2baf-01de-4e80-8434-4488846fd7fb-utilities\") pod \"community-operators-tmwd8\" (UID: \"6fac2baf-01de-4e80-8434-4488846fd7fb\") " pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.536322 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fac2baf-01de-4e80-8434-4488846fd7fb-utilities\") pod \"community-operators-tmwd8\" (UID: \"6fac2baf-01de-4e80-8434-4488846fd7fb\") " pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:11:47 crc kubenswrapper[4807]: E1127 11:11:47.536551 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:48.03654118 +0000 UTC m=+149.136039378 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.536872 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fac2baf-01de-4e80-8434-4488846fd7fb-catalog-content\") pod \"community-operators-tmwd8\" (UID: \"6fac2baf-01de-4e80-8434-4488846fd7fb\") " pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.569570 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qxg4\" (UniqueName: \"kubernetes.io/projected/6fac2baf-01de-4e80-8434-4488846fd7fb-kube-api-access-5qxg4\") pod \"community-operators-tmwd8\" (UID: \"6fac2baf-01de-4e80-8434-4488846fd7fb\") " pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.582411 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f8qqm"] Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.583334 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f8qqm"] Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.583420 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.588671 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.647691 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.647877 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj497\" (UniqueName: \"kubernetes.io/projected/16652a33-af22-4522-bd9c-8491bd6ae24f-kube-api-access-tj497\") pod \"certified-operators-f8qqm\" (UID: \"16652a33-af22-4522-bd9c-8491bd6ae24f\") " pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.647915 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16652a33-af22-4522-bd9c-8491bd6ae24f-utilities\") pod \"certified-operators-f8qqm\" (UID: \"16652a33-af22-4522-bd9c-8491bd6ae24f\") " pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.647951 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16652a33-af22-4522-bd9c-8491bd6ae24f-catalog-content\") pod \"certified-operators-f8qqm\" (UID: \"16652a33-af22-4522-bd9c-8491bd6ae24f\") " pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:11:47 crc kubenswrapper[4807]: E1127 11:11:47.648076 4807 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:48.148061678 +0000 UTC m=+149.247559876 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.683382 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.692807 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.693523 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.713804 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.714101 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.765361 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/993c4b20-6e07-41fc-b269-6d716c9c25c7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"993c4b20-6e07-41fc-b269-6d716c9c25c7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.765431 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj497\" (UniqueName: \"kubernetes.io/projected/16652a33-af22-4522-bd9c-8491bd6ae24f-kube-api-access-tj497\") pod \"certified-operators-f8qqm\" (UID: \"16652a33-af22-4522-bd9c-8491bd6ae24f\") " pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.765473 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/993c4b20-6e07-41fc-b269-6d716c9c25c7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"993c4b20-6e07-41fc-b269-6d716c9c25c7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.765508 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.765530 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16652a33-af22-4522-bd9c-8491bd6ae24f-utilities\") pod \"certified-operators-f8qqm\" (UID: \"16652a33-af22-4522-bd9c-8491bd6ae24f\") " pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.765621 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.765649 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16652a33-af22-4522-bd9c-8491bd6ae24f-catalog-content\") pod \"certified-operators-f8qqm\" (UID: \"16652a33-af22-4522-bd9c-8491bd6ae24f\") " pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.766117 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16652a33-af22-4522-bd9c-8491bd6ae24f-catalog-content\") pod \"certified-operators-f8qqm\" (UID: \"16652a33-af22-4522-bd9c-8491bd6ae24f\") " pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:11:47 crc kubenswrapper[4807]: E1127 11:11:47.766659 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:48.26645379 +0000 UTC m=+149.365951988 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.779046 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16652a33-af22-4522-bd9c-8491bd6ae24f-utilities\") pod \"certified-operators-f8qqm\" (UID: \"16652a33-af22-4522-bd9c-8491bd6ae24f\") " pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.823595 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj497\" (UniqueName: \"kubernetes.io/projected/16652a33-af22-4522-bd9c-8491bd6ae24f-kube-api-access-tj497\") pod \"certified-operators-f8qqm\" (UID: \"16652a33-af22-4522-bd9c-8491bd6ae24f\") " pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.823915 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ntrpq"] Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.824982 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.826051 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ntrpq"] Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.867142 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.867379 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjt52\" (UniqueName: \"kubernetes.io/projected/14518714-cc42-4323-a3c9-307047368353-kube-api-access-zjt52\") pod \"certified-operators-ntrpq\" (UID: \"14518714-cc42-4323-a3c9-307047368353\") " pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.867404 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14518714-cc42-4323-a3c9-307047368353-utilities\") pod \"certified-operators-ntrpq\" (UID: \"14518714-cc42-4323-a3c9-307047368353\") " pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.867430 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/993c4b20-6e07-41fc-b269-6d716c9c25c7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"993c4b20-6e07-41fc-b269-6d716c9c25c7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.867445 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14518714-cc42-4323-a3c9-307047368353-catalog-content\") pod \"certified-operators-ntrpq\" (UID: \"14518714-cc42-4323-a3c9-307047368353\") " pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.867467 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/993c4b20-6e07-41fc-b269-6d716c9c25c7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"993c4b20-6e07-41fc-b269-6d716c9c25c7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 11:11:47 crc kubenswrapper[4807]: E1127 11:11:47.867847 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:48.367829446 +0000 UTC m=+149.467327644 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.867904 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/993c4b20-6e07-41fc-b269-6d716c9c25c7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"993c4b20-6e07-41fc-b269-6d716c9c25c7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.884576 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9pwb" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.920446 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jvchs" event={"ID":"658b2bd5-04cf-410b-a749-e0e67246ac4c","Type":"ContainerStarted","Data":"81eb583bf84def87ba2854272f7ef5cbacf5d07465f14fa745de402474bc141e"} Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.920487 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jvchs" event={"ID":"658b2bd5-04cf-410b-a749-e0e67246ac4c","Type":"ContainerStarted","Data":"962f911083a6c6a155e8470993c952238a6d55c13a958acb22a9ee3e5e913e8c"} Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.921577 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/993c4b20-6e07-41fc-b269-6d716c9c25c7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"993c4b20-6e07-41fc-b269-6d716c9c25c7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.933137 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.952992 4807 generic.go:334] "Generic (PLEG): container finished" podID="acbc9004-f19e-419d-b609-2f9dda223b0d" containerID="070b8fbc8145631b85d09fdc49b50fcf20feaa04047883449ef385dc134ab75e" exitCode=0 Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.953754 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" event={"ID":"acbc9004-f19e-419d-b609-2f9dda223b0d","Type":"ContainerDied","Data":"070b8fbc8145631b85d09fdc49b50fcf20feaa04047883449ef385dc134ab75e"} Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.981900 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjt52\" (UniqueName: \"kubernetes.io/projected/14518714-cc42-4323-a3c9-307047368353-kube-api-access-zjt52\") pod \"certified-operators-ntrpq\" (UID: \"14518714-cc42-4323-a3c9-307047368353\") " pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.981939 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14518714-cc42-4323-a3c9-307047368353-utilities\") pod \"certified-operators-ntrpq\" (UID: \"14518714-cc42-4323-a3c9-307047368353\") " pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.981973 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14518714-cc42-4323-a3c9-307047368353-catalog-content\") pod \"certified-operators-ntrpq\" (UID: 
\"14518714-cc42-4323-a3c9-307047368353\") " pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.982016 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.983354 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14518714-cc42-4323-a3c9-307047368353-utilities\") pod \"certified-operators-ntrpq\" (UID: \"14518714-cc42-4323-a3c9-307047368353\") " pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:11:47 crc kubenswrapper[4807]: I1127 11:11:47.983813 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14518714-cc42-4323-a3c9-307047368353-catalog-content\") pod \"certified-operators-ntrpq\" (UID: \"14518714-cc42-4323-a3c9-307047368353\") " pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:11:47 crc kubenswrapper[4807]: E1127 11:11:47.984024 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:48.484013332 +0000 UTC m=+149.583511530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.020527 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjt52\" (UniqueName: \"kubernetes.io/projected/14518714-cc42-4323-a3c9-307047368353-kube-api-access-zjt52\") pod \"certified-operators-ntrpq\" (UID: \"14518714-cc42-4323-a3c9-307047368353\") " pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.047572 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.083524 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:48 crc kubenswrapper[4807]: E1127 11:11:48.084688 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:48.584673197 +0000 UTC m=+149.684171395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.091819 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4zn8d"] Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.173966 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.185342 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:48 crc kubenswrapper[4807]: E1127 11:11:48.185648 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:48.68563591 +0000 UTC m=+149.785134098 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.198841 4807 patch_prober.go:28] interesting pod/router-default-5444994796-7kpfm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 11:11:48 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Nov 27 11:11:48 crc kubenswrapper[4807]: [+]process-running ok Nov 27 11:11:48 crc kubenswrapper[4807]: healthz check failed Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.198900 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7kpfm" podUID="0a7ac40a-0ecf-482a-9353-e6c71787da7e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.201364 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmwd8"] Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.286042 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:48 crc kubenswrapper[4807]: E1127 11:11:48.286331 4807 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:48.786316415 +0000 UTC m=+149.885814613 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.304650 4807 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.371457 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f8qqm"] Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.388125 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:48 crc kubenswrapper[4807]: E1127 11:11:48.388520 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:48.888504605 +0000 UTC m=+149.988002803 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.494643 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:48 crc kubenswrapper[4807]: E1127 11:11:48.494973 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:48.994937421 +0000 UTC m=+150.094435619 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.512775 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.577377 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ntrpq"] Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.595428 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:48 crc kubenswrapper[4807]: E1127 11:11:48.595746 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:49.095729549 +0000 UTC m=+150.195227747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:48 crc kubenswrapper[4807]: W1127 11:11:48.596716 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14518714_cc42_4323_a3c9_307047368353.slice/crio-a0099fd86f5e3dc5b03c6953de8910e1005ff39ee9c53f90967b470cb9f9e699 WatchSource:0}: Error finding container a0099fd86f5e3dc5b03c6953de8910e1005ff39ee9c53f90967b470cb9f9e699: Status 404 returned error can't find the container with id a0099fd86f5e3dc5b03c6953de8910e1005ff39ee9c53f90967b470cb9f9e699 Nov 27 11:11:48 crc kubenswrapper[4807]: W1127 11:11:48.597646 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod993c4b20_6e07_41fc_b269_6d716c9c25c7.slice/crio-fbf1516717dc882ba468739af83527a26b4521d3ef463bf1f6e53141c957fe33 WatchSource:0}: Error finding container fbf1516717dc882ba468739af83527a26b4521d3ef463bf1f6e53141c957fe33: Status 404 returned error can't find the container with id fbf1516717dc882ba468739af83527a26b4521d3ef463bf1f6e53141c957fe33 Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.697060 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:48 crc kubenswrapper[4807]: E1127 11:11:48.697216 4807 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:49.197194248 +0000 UTC m=+150.296692466 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.697484 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:48 crc kubenswrapper[4807]: E1127 11:11:48.697726 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:49.197718523 +0000 UTC m=+150.297216721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.798448 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:48 crc kubenswrapper[4807]: E1127 11:11:48.798595 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:49.298574134 +0000 UTC m=+150.398072342 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.798708 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:48 crc kubenswrapper[4807]: E1127 11:11:48.799088 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:49.299078579 +0000 UTC m=+150.398576777 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.900259 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:48 crc kubenswrapper[4807]: E1127 11:11:48.900601 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:49.400582698 +0000 UTC m=+150.500080896 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.900779 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:48 crc kubenswrapper[4807]: E1127 11:11:48.901045 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:49.401033292 +0000 UTC m=+150.500531490 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.961759 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d2b253cae4b1ea642aa485c635f368a511246234ece1ed1fdf1fe69c872702a9"} Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.961817 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a3339c0b1c93540919edacb35a1dadbdfcd9f94085c38c6a0c20cc5505f3f8f4"} Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.964101 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jvchs" event={"ID":"658b2bd5-04cf-410b-a749-e0e67246ac4c","Type":"ContainerStarted","Data":"6e90ee02bd6ff7d6a7858cc5981e099221fb759d61c5d1e5cbc87516bb3e3d31"} Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.965220 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"993c4b20-6e07-41fc-b269-6d716c9c25c7","Type":"ContainerStarted","Data":"40f83aa65264bf6b293439b092638c94eb79f906352f64e00e92ff8d33344112"} Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.965345 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"993c4b20-6e07-41fc-b269-6d716c9c25c7","Type":"ContainerStarted","Data":"fbf1516717dc882ba468739af83527a26b4521d3ef463bf1f6e53141c957fe33"} Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.970624 4807 generic.go:334] "Generic (PLEG): container finished" podID="6fac2baf-01de-4e80-8434-4488846fd7fb" containerID="c42149efad8976068d96e9365a4d6b4c271468a3eed7f62a906b56b46d98d908" exitCode=0 Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.970932 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmwd8" event={"ID":"6fac2baf-01de-4e80-8434-4488846fd7fb","Type":"ContainerDied","Data":"c42149efad8976068d96e9365a4d6b4c271468a3eed7f62a906b56b46d98d908"} Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.970983 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmwd8" event={"ID":"6fac2baf-01de-4e80-8434-4488846fd7fb","Type":"ContainerStarted","Data":"e6b164bd1c2af3b650cb30bf67ccb8c9ea8397e89b05d4e31c7c374f86b9fe36"} Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.974781 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.980633 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a15f1c2090b905e67f479afd66618a6445e505e3caf0590f32cced628cc2998a"} Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.980680 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e6e2f569c4ea9b85240c672c811115ef9bcaf99f4c127fffec694937e350c69c"} Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.981212 4807 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.982729 4807 generic.go:334] "Generic (PLEG): container finished" podID="7984b376-029e-465e-893e-f62f047ee418" containerID="3bc07e9f509d87b31878c2444d977e7420f6191a07d14d7fccca97ee7abc67dc" exitCode=0 Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.982781 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zn8d" event={"ID":"7984b376-029e-465e-893e-f62f047ee418","Type":"ContainerDied","Data":"3bc07e9f509d87b31878c2444d977e7420f6191a07d14d7fccca97ee7abc67dc"} Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.982841 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zn8d" event={"ID":"7984b376-029e-465e-893e-f62f047ee418","Type":"ContainerStarted","Data":"7c9daf59de1dde193c9dcba747459a10432e0ed84264b4c2f1c7ae043d4be4c7"} Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.986428 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b5513a0f6b5884661d7b21438ce014cabb7c7c7eb75f3f539a5b32b1e6825a1d"} Nov 27 11:11:48 crc kubenswrapper[4807]: I1127 11:11:48.986470 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f82ae98a0bfe4448e0ce44baadceb82d484618d5daff9e35642f3123edc8fd40"} Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.020875 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:49 crc kubenswrapper[4807]: E1127 11:11:49.020963 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-27 11:11:49.520946889 +0000 UTC m=+150.620445087 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.021170 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.021343 4807 generic.go:334] "Generic (PLEG): container finished" podID="16652a33-af22-4522-bd9c-8491bd6ae24f" containerID="06292c7cb6be8587c920949db58a1c438c570e7f5c43597a23e707700de7df08" exitCode=0 Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.021404 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8qqm" event={"ID":"16652a33-af22-4522-bd9c-8491bd6ae24f","Type":"ContainerDied","Data":"06292c7cb6be8587c920949db58a1c438c570e7f5c43597a23e707700de7df08"} Nov 27 11:11:49 crc 
kubenswrapper[4807]: I1127 11:11:49.021436 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8qqm" event={"ID":"16652a33-af22-4522-bd9c-8491bd6ae24f","Type":"ContainerStarted","Data":"a984142023d554ae6adb3a3b84f913b88c5120ca2d4e3d28672ff1c57435ea1a"} Nov 27 11:11:49 crc kubenswrapper[4807]: E1127 11:11:49.021470 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:49.521457444 +0000 UTC m=+150.620955652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.024804 4807 generic.go:334] "Generic (PLEG): container finished" podID="14518714-cc42-4323-a3c9-307047368353" containerID="d65b7e049aca72d2e5ab254b49abf9c7700afa16e3c77a1e44106d4cc4f859c8" exitCode=0 Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.024890 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntrpq" event={"ID":"14518714-cc42-4323-a3c9-307047368353","Type":"ContainerDied","Data":"d65b7e049aca72d2e5ab254b49abf9c7700afa16e3c77a1e44106d4cc4f859c8"} Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.024920 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntrpq" 
event={"ID":"14518714-cc42-4323-a3c9-307047368353","Type":"ContainerStarted","Data":"a0099fd86f5e3dc5b03c6953de8910e1005ff39ee9c53f90967b470cb9f9e699"} Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.077712 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-jvchs" podStartSLOduration=12.077691667 podStartE2EDuration="12.077691667s" podCreationTimestamp="2025-11-27 11:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:49.075991137 +0000 UTC m=+150.175489335" watchObservedRunningTime="2025-11-27 11:11:49.077691667 +0000 UTC m=+150.177189865" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.086338 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.086320624 podStartE2EDuration="2.086320624s" podCreationTimestamp="2025-11-27 11:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:49.085637124 +0000 UTC m=+150.185135322" watchObservedRunningTime="2025-11-27 11:11:49.086320624 +0000 UTC m=+150.185818822" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.122758 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:49 crc kubenswrapper[4807]: E1127 11:11:49.122882 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-27 11:11:49.622862691 +0000 UTC m=+150.722360889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.123036 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:49 crc kubenswrapper[4807]: E1127 11:11:49.124328 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-27 11:11:49.624313234 +0000 UTC m=+150.723811432 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-sdd69" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.133990 4807 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-27T11:11:48.304672831Z","Handler":null,"Name":""} Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.149133 4807 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.149168 4807 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.183870 4807 patch_prober.go:28] interesting pod/router-default-5444994796-7kpfm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 11:11:49 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Nov 27 11:11:49 crc kubenswrapper[4807]: [+]process-running ok Nov 27 11:11:49 crc kubenswrapper[4807]: healthz check failed Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.183916 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7kpfm" podUID="0a7ac40a-0ecf-482a-9353-e6c71787da7e" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.224033 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.226878 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.257745 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.321049 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bl9gg"] Nov 27 11:11:49 crc kubenswrapper[4807]: E1127 11:11:49.321315 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbc9004-f19e-419d-b609-2f9dda223b0d" containerName="collect-profiles" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.321328 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbc9004-f19e-419d-b609-2f9dda223b0d" containerName="collect-profiles" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.321477 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="acbc9004-f19e-419d-b609-2f9dda223b0d" containerName="collect-profiles" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.322333 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.324484 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.324648 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acbc9004-f19e-419d-b609-2f9dda223b0d-config-volume\") pod \"acbc9004-f19e-419d-b609-2f9dda223b0d\" (UID: \"acbc9004-f19e-419d-b609-2f9dda223b0d\") " Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.324744 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acbc9004-f19e-419d-b609-2f9dda223b0d-secret-volume\") pod \"acbc9004-f19e-419d-b609-2f9dda223b0d\" (UID: \"acbc9004-f19e-419d-b609-2f9dda223b0d\") " Nov 27 11:11:49 crc kubenswrapper[4807]: 
I1127 11:11:49.324776 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq8gs\" (UniqueName: \"kubernetes.io/projected/acbc9004-f19e-419d-b609-2f9dda223b0d-kube-api-access-dq8gs\") pod \"acbc9004-f19e-419d-b609-2f9dda223b0d\" (UID: \"acbc9004-f19e-419d-b609-2f9dda223b0d\") " Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.324933 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.325432 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acbc9004-f19e-419d-b609-2f9dda223b0d-config-volume" (OuterVolumeSpecName: "config-volume") pod "acbc9004-f19e-419d-b609-2f9dda223b0d" (UID: "acbc9004-f19e-419d-b609-2f9dda223b0d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.326843 4807 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.326881 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.332814 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bl9gg"] Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.339842 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acbc9004-f19e-419d-b609-2f9dda223b0d-kube-api-access-dq8gs" (OuterVolumeSpecName: "kube-api-access-dq8gs") pod "acbc9004-f19e-419d-b609-2f9dda223b0d" (UID: "acbc9004-f19e-419d-b609-2f9dda223b0d"). InnerVolumeSpecName "kube-api-access-dq8gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.346825 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acbc9004-f19e-419d-b609-2f9dda223b0d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "acbc9004-f19e-419d-b609-2f9dda223b0d" (UID: "acbc9004-f19e-419d-b609-2f9dda223b0d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.367736 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-sdd69\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.425576 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzlmq\" (UniqueName: \"kubernetes.io/projected/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-kube-api-access-jzlmq\") pod \"redhat-marketplace-bl9gg\" (UID: \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\") " pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.425653 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-catalog-content\") pod \"redhat-marketplace-bl9gg\" (UID: \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\") " pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.425699 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-utilities\") pod \"redhat-marketplace-bl9gg\" (UID: \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\") " pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.425743 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acbc9004-f19e-419d-b609-2f9dda223b0d-config-volume\") on node \"crc\" DevicePath 
\"\"" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.425752 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acbc9004-f19e-419d-b609-2f9dda223b0d-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.425762 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq8gs\" (UniqueName: \"kubernetes.io/projected/acbc9004-f19e-419d-b609-2f9dda223b0d-kube-api-access-dq8gs\") on node \"crc\" DevicePath \"\"" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.494373 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.526845 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-utilities\") pod \"redhat-marketplace-bl9gg\" (UID: \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\") " pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.526922 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzlmq\" (UniqueName: \"kubernetes.io/projected/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-kube-api-access-jzlmq\") pod \"redhat-marketplace-bl9gg\" (UID: \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\") " pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.526987 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-catalog-content\") pod \"redhat-marketplace-bl9gg\" (UID: \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\") " pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 
11:11:49.527401 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-utilities\") pod \"redhat-marketplace-bl9gg\" (UID: \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\") " pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.527431 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-catalog-content\") pod \"redhat-marketplace-bl9gg\" (UID: \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\") " pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.554394 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzlmq\" (UniqueName: \"kubernetes.io/projected/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-kube-api-access-jzlmq\") pod \"redhat-marketplace-bl9gg\" (UID: \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\") " pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.563048 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.635590 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.733238 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t9x6p"] Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.734424 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.741316 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9x6p"] Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.752446 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sdd69"] Nov 27 11:11:49 crc kubenswrapper[4807]: W1127 11:11:49.786838 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35c0378e_2da0_4e94_8230_2db66a4c7993.slice/crio-9b539da781fd854f195e52cb89a261a9a03b478705473a6cdd281f5f55f90437 WatchSource:0}: Error finding container 9b539da781fd854f195e52cb89a261a9a03b478705473a6cdd281f5f55f90437: Status 404 returned error can't find the container with id 9b539da781fd854f195e52cb89a261a9a03b478705473a6cdd281f5f55f90437 Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.851586 4807 patch_prober.go:28] interesting pod/downloads-7954f5f757-jbssf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.851685 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jbssf" podUID="3daa1b2f-7da1-475f-8807-299bcf8423ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.852129 4807 patch_prober.go:28] interesting pod/downloads-7954f5f757-jbssf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= 
Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.852186 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jbssf" podUID="3daa1b2f-7da1-475f-8807-299bcf8423ca" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.853984 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-utilities\") pod \"redhat-marketplace-t9x6p\" (UID: \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\") " pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.854030 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-catalog-content\") pod \"redhat-marketplace-t9x6p\" (UID: \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\") " pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.854091 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdlqt\" (UniqueName: \"kubernetes.io/projected/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-kube-api-access-gdlqt\") pod \"redhat-marketplace-t9x6p\" (UID: \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\") " pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.955459 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-utilities\") pod \"redhat-marketplace-t9x6p\" (UID: \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\") " pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 
11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.955811 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-catalog-content\") pod \"redhat-marketplace-t9x6p\" (UID: \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\") " pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.955846 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdlqt\" (UniqueName: \"kubernetes.io/projected/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-kube-api-access-gdlqt\") pod \"redhat-marketplace-t9x6p\" (UID: \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\") " pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.956057 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-utilities\") pod \"redhat-marketplace-t9x6p\" (UID: \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\") " pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.956362 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-catalog-content\") pod \"redhat-marketplace-t9x6p\" (UID: \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\") " pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:11:49 crc kubenswrapper[4807]: I1127 11:11:49.989533 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdlqt\" (UniqueName: \"kubernetes.io/projected/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-kube-api-access-gdlqt\") pod \"redhat-marketplace-t9x6p\" (UID: \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\") " pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:11:50 crc kubenswrapper[4807]: 
I1127 11:11:50.040058 4807 generic.go:334] "Generic (PLEG): container finished" podID="993c4b20-6e07-41fc-b269-6d716c9c25c7" containerID="40f83aa65264bf6b293439b092638c94eb79f906352f64e00e92ff8d33344112" exitCode=0 Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.040171 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"993c4b20-6e07-41fc-b269-6d716c9c25c7","Type":"ContainerDied","Data":"40f83aa65264bf6b293439b092638c94eb79f906352f64e00e92ff8d33344112"} Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.042418 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" event={"ID":"acbc9004-f19e-419d-b609-2f9dda223b0d","Type":"ContainerDied","Data":"4803e5ec4a08da5fc2c4b304155ef2a849fca47f1c5fcd31e30c31f881e57487"} Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.042470 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4803e5ec4a08da5fc2c4b304155ef2a849fca47f1c5fcd31e30c31f881e57487" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.042486 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.052414 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" event={"ID":"35c0378e-2da0-4e94-8230-2db66a4c7993","Type":"ContainerStarted","Data":"1517724e1d08cf6ea9812b2f939807a259e901352ff3949c2dd4795c3a029eeb"} Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.052448 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" event={"ID":"35c0378e-2da0-4e94-8230-2db66a4c7993","Type":"ContainerStarted","Data":"9b539da781fd854f195e52cb89a261a9a03b478705473a6cdd281f5f55f90437"} Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.076988 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.077024 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" podStartSLOduration=132.077004536 podStartE2EDuration="2m12.077004536s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:50.076957745 +0000 UTC m=+151.176455943" watchObservedRunningTime="2025-11-27 11:11:50.077004536 +0000 UTC m=+151.176502734" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.127278 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bl9gg"] Nov 27 11:11:50 crc kubenswrapper[4807]: W1127 11:11:50.148965 4807 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a8d58fc_68ed_46cd_bd2d_23ac69cac2b5.slice/crio-db1fa9aee0ab1fa3ef2464d0f517d57852faea54fc76c75135759423009fd564 WatchSource:0}: Error finding container db1fa9aee0ab1fa3ef2464d0f517d57852faea54fc76c75135759423009fd564: Status 404 returned error can't find the container with id db1fa9aee0ab1fa3ef2464d0f517d57852faea54fc76c75135759423009fd564 Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.188615 4807 patch_prober.go:28] interesting pod/router-default-5444994796-7kpfm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 11:11:50 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Nov 27 11:11:50 crc kubenswrapper[4807]: [+]process-running ok Nov 27 11:11:50 crc kubenswrapper[4807]: healthz check failed Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.188688 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7kpfm" podUID="0a7ac40a-0ecf-482a-9353-e6c71787da7e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.336400 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vhk66"] Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.346761 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhk66"] Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.346873 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.348378 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.461818 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c61dff77-8482-4e06-b99e-72c1cd18c4ca-catalog-content\") pod \"redhat-operators-vhk66\" (UID: \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\") " pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.461884 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c61dff77-8482-4e06-b99e-72c1cd18c4ca-utilities\") pod \"redhat-operators-vhk66\" (UID: \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\") " pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.461920 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hkt6\" (UniqueName: \"kubernetes.io/projected/c61dff77-8482-4e06-b99e-72c1cd18c4ca-kube-api-access-9hkt6\") pod \"redhat-operators-vhk66\" (UID: \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\") " pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.542452 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9x6p"] Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.562819 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hkt6\" (UniqueName: \"kubernetes.io/projected/c61dff77-8482-4e06-b99e-72c1cd18c4ca-kube-api-access-9hkt6\") pod \"redhat-operators-vhk66\" (UID: 
\"c61dff77-8482-4e06-b99e-72c1cd18c4ca\") " pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.562946 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c61dff77-8482-4e06-b99e-72c1cd18c4ca-catalog-content\") pod \"redhat-operators-vhk66\" (UID: \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\") " pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.562995 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c61dff77-8482-4e06-b99e-72c1cd18c4ca-utilities\") pod \"redhat-operators-vhk66\" (UID: \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\") " pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:11:50 crc kubenswrapper[4807]: W1127 11:11:50.563067 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1df798f7_f40a_41b8_b4fb_1d6cf4e05f52.slice/crio-a0e98ef7a955da5902830d51bb1626c46e361fd93af849d6e70ad6ab1a482e0a WatchSource:0}: Error finding container a0e98ef7a955da5902830d51bb1626c46e361fd93af849d6e70ad6ab1a482e0a: Status 404 returned error can't find the container with id a0e98ef7a955da5902830d51bb1626c46e361fd93af849d6e70ad6ab1a482e0a Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.564087 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c61dff77-8482-4e06-b99e-72c1cd18c4ca-utilities\") pod \"redhat-operators-vhk66\" (UID: \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\") " pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.564161 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c61dff77-8482-4e06-b99e-72c1cd18c4ca-catalog-content\") pod \"redhat-operators-vhk66\" (UID: \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\") " pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.582617 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hkt6\" (UniqueName: \"kubernetes.io/projected/c61dff77-8482-4e06-b99e-72c1cd18c4ca-kube-api-access-9hkt6\") pod \"redhat-operators-vhk66\" (UID: \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\") " pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.664514 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.723665 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p7s6f"] Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.725903 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.733573 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p7s6f"] Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.871206 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e639c6-6aad-4df0-b118-cd50f488a026-utilities\") pod \"redhat-operators-p7s6f\" (UID: \"54e639c6-6aad-4df0-b118-cd50f488a026\") " pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.871332 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxq2n\" (UniqueName: \"kubernetes.io/projected/54e639c6-6aad-4df0-b118-cd50f488a026-kube-api-access-bxq2n\") pod \"redhat-operators-p7s6f\" (UID: \"54e639c6-6aad-4df0-b118-cd50f488a026\") " pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.871356 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e639c6-6aad-4df0-b118-cd50f488a026-catalog-content\") pod \"redhat-operators-p7s6f\" (UID: \"54e639c6-6aad-4df0-b118-cd50f488a026\") " pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.923020 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.923078 4807 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.972555 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxq2n\" (UniqueName: \"kubernetes.io/projected/54e639c6-6aad-4df0-b118-cd50f488a026-kube-api-access-bxq2n\") pod \"redhat-operators-p7s6f\" (UID: \"54e639c6-6aad-4df0-b118-cd50f488a026\") " pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.972601 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e639c6-6aad-4df0-b118-cd50f488a026-catalog-content\") pod \"redhat-operators-p7s6f\" (UID: \"54e639c6-6aad-4df0-b118-cd50f488a026\") " pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.972625 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e639c6-6aad-4df0-b118-cd50f488a026-utilities\") pod \"redhat-operators-p7s6f\" (UID: \"54e639c6-6aad-4df0-b118-cd50f488a026\") " pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.973175 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e639c6-6aad-4df0-b118-cd50f488a026-utilities\") pod \"redhat-operators-p7s6f\" (UID: \"54e639c6-6aad-4df0-b118-cd50f488a026\") " pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.973689 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/54e639c6-6aad-4df0-b118-cd50f488a026-catalog-content\") pod \"redhat-operators-p7s6f\" (UID: \"54e639c6-6aad-4df0-b118-cd50f488a026\") " pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:11:50 crc kubenswrapper[4807]: I1127 11:11:50.990100 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxq2n\" (UniqueName: \"kubernetes.io/projected/54e639c6-6aad-4df0-b118-cd50f488a026-kube-api-access-bxq2n\") pod \"redhat-operators-p7s6f\" (UID: \"54e639c6-6aad-4df0-b118-cd50f488a026\") " pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.089802 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.095715 4807 generic.go:334] "Generic (PLEG): container finished" podID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" containerID="df88f9a6750fc797ea8b621d3e3fa8b3f79b8c37b54a21e9efdcd52eecfeb675" exitCode=0 Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.097657 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9gg" event={"ID":"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5","Type":"ContainerDied","Data":"df88f9a6750fc797ea8b621d3e3fa8b3f79b8c37b54a21e9efdcd52eecfeb675"} Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.097684 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9gg" event={"ID":"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5","Type":"ContainerStarted","Data":"db1fa9aee0ab1fa3ef2464d0f517d57852faea54fc76c75135759423009fd564"} Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.112915 4807 generic.go:334] "Generic (PLEG): container finished" podID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" containerID="a968c44c5a31646c4c93cbba3d53241788c4ba62745b81fe4ed6c0aafde051f7" exitCode=0 Nov 27 11:11:51 crc kubenswrapper[4807]: 
I1127 11:11:51.114419 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9x6p" event={"ID":"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52","Type":"ContainerDied","Data":"a968c44c5a31646c4c93cbba3d53241788c4ba62745b81fe4ed6c0aafde051f7"} Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.114455 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.114470 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9x6p" event={"ID":"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52","Type":"ContainerStarted","Data":"a0e98ef7a955da5902830d51bb1626c46e361fd93af849d6e70ad6ab1a482e0a"} Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.128728 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vhk66"] Nov 27 11:11:51 crc kubenswrapper[4807]: W1127 11:11:51.142453 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc61dff77_8482_4e06_b99e_72c1cd18c4ca.slice/crio-d778404a16cf691295fa1c89bcf6f82b237523bf2078b067653d0b587a01a772 WatchSource:0}: Error finding container d778404a16cf691295fa1c89bcf6f82b237523bf2078b067653d0b587a01a772: Status 404 returned error can't find the container with id d778404a16cf691295fa1c89bcf6f82b237523bf2078b067653d0b587a01a772 Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.181409 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.184059 4807 patch_prober.go:28] interesting pod/router-default-5444994796-7kpfm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Nov 27 11:11:51 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Nov 27 11:11:51 crc kubenswrapper[4807]: [+]process-running ok Nov 27 11:11:51 crc kubenswrapper[4807]: healthz check failed Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.184093 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7kpfm" podUID="0a7ac40a-0ecf-482a-9353-e6c71787da7e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.457758 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.490010 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.490366 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.490956 4807 patch_prober.go:28] interesting pod/console-f9d7485db-jdsqc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.490998 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jdsqc" podUID="93c49e07-08ef-4b31-abb3-787a46a3fbfd" containerName="console" probeResult="failure" output="Get \"https://10.217.0.26:8443/health\": dial tcp 10.217.0.26:8443: connect: connection refused" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.552165 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:51 crc 
kubenswrapper[4807]: I1127 11:11:51.556360 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.557456 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-wk6hj" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.572441 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.584354 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/993c4b20-6e07-41fc-b269-6d716c9c25c7-kubelet-dir\") pod \"993c4b20-6e07-41fc-b269-6d716c9c25c7\" (UID: \"993c4b20-6e07-41fc-b269-6d716c9c25c7\") " Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.584417 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/993c4b20-6e07-41fc-b269-6d716c9c25c7-kube-api-access\") pod \"993c4b20-6e07-41fc-b269-6d716c9c25c7\" (UID: \"993c4b20-6e07-41fc-b269-6d716c9c25c7\") " Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.585484 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/993c4b20-6e07-41fc-b269-6d716c9c25c7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "993c4b20-6e07-41fc-b269-6d716c9c25c7" (UID: "993c4b20-6e07-41fc-b269-6d716c9c25c7"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.591422 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/993c4b20-6e07-41fc-b269-6d716c9c25c7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "993c4b20-6e07-41fc-b269-6d716c9c25c7" (UID: "993c4b20-6e07-41fc-b269-6d716c9c25c7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.674040 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p7s6f"] Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.688057 4807 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/993c4b20-6e07-41fc-b269-6d716c9c25c7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.688083 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/993c4b20-6e07-41fc-b269-6d716c9c25c7-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 11:11:51 crc kubenswrapper[4807]: W1127 11:11:51.706792 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54e639c6_6aad_4df0_b118_cd50f488a026.slice/crio-77f592610ef2449b256cd7c075a1f58d68d487fa8fc48806f74e7d5ffbd837f2 WatchSource:0}: Error finding container 77f592610ef2449b256cd7c075a1f58d68d487fa8fc48806f74e7d5ffbd837f2: Status 404 returned error can't find the container with id 77f592610ef2449b256cd7c075a1f58d68d487fa8fc48806f74e7d5ffbd837f2 Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.754898 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 27 11:11:51 crc kubenswrapper[4807]: E1127 11:11:51.755129 4807 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="993c4b20-6e07-41fc-b269-6d716c9c25c7" containerName="pruner" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.755147 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="993c4b20-6e07-41fc-b269-6d716c9c25c7" containerName="pruner" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.755288 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="993c4b20-6e07-41fc-b269-6d716c9c25c7" containerName="pruner" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.755671 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.758940 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.759160 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.759694 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.835743 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.835786 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.845675 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.891141 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/605c0c71-6929-4ede-8a49-22bfd1d9a8ac-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"605c0c71-6929-4ede-8a49-22bfd1d9a8ac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.891337 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/605c0c71-6929-4ede-8a49-22bfd1d9a8ac-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"605c0c71-6929-4ede-8a49-22bfd1d9a8ac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.992645 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/605c0c71-6929-4ede-8a49-22bfd1d9a8ac-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"605c0c71-6929-4ede-8a49-22bfd1d9a8ac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.992701 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/605c0c71-6929-4ede-8a49-22bfd1d9a8ac-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"605c0c71-6929-4ede-8a49-22bfd1d9a8ac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 11:11:51 crc kubenswrapper[4807]: I1127 11:11:51.992825 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/605c0c71-6929-4ede-8a49-22bfd1d9a8ac-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"605c0c71-6929-4ede-8a49-22bfd1d9a8ac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 11:11:52 crc kubenswrapper[4807]: I1127 11:11:52.015649 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/605c0c71-6929-4ede-8a49-22bfd1d9a8ac-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"605c0c71-6929-4ede-8a49-22bfd1d9a8ac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 11:11:52 crc kubenswrapper[4807]: I1127 11:11:52.088120 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 11:11:52 crc kubenswrapper[4807]: I1127 11:11:52.151577 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7s6f" event={"ID":"54e639c6-6aad-4df0-b118-cd50f488a026","Type":"ContainerStarted","Data":"77f592610ef2449b256cd7c075a1f58d68d487fa8fc48806f74e7d5ffbd837f2"} Nov 27 11:11:52 crc kubenswrapper[4807]: I1127 11:11:52.185435 4807 patch_prober.go:28] interesting pod/router-default-5444994796-7kpfm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 11:11:52 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Nov 27 11:11:52 crc kubenswrapper[4807]: [+]process-running ok Nov 27 11:11:52 crc kubenswrapper[4807]: healthz check failed Nov 27 11:11:52 crc kubenswrapper[4807]: I1127 11:11:52.185496 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7kpfm" podUID="0a7ac40a-0ecf-482a-9353-e6c71787da7e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:11:52 crc kubenswrapper[4807]: I1127 11:11:52.225510 4807 generic.go:334] "Generic (PLEG): container finished" podID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" containerID="1a2146acc723cc20833dd626417af8ebffe5ab90b10817b282b0d18945e21f6f" exitCode=0 Nov 27 11:11:52 crc kubenswrapper[4807]: I1127 11:11:52.225595 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhk66" 
event={"ID":"c61dff77-8482-4e06-b99e-72c1cd18c4ca","Type":"ContainerDied","Data":"1a2146acc723cc20833dd626417af8ebffe5ab90b10817b282b0d18945e21f6f"} Nov 27 11:11:52 crc kubenswrapper[4807]: I1127 11:11:52.225623 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhk66" event={"ID":"c61dff77-8482-4e06-b99e-72c1cd18c4ca","Type":"ContainerStarted","Data":"d778404a16cf691295fa1c89bcf6f82b237523bf2078b067653d0b587a01a772"} Nov 27 11:11:52 crc kubenswrapper[4807]: I1127 11:11:52.251608 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 27 11:11:52 crc kubenswrapper[4807]: I1127 11:11:52.254350 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"993c4b20-6e07-41fc-b269-6d716c9c25c7","Type":"ContainerDied","Data":"fbf1516717dc882ba468739af83527a26b4521d3ef463bf1f6e53141c957fe33"} Nov 27 11:11:52 crc kubenswrapper[4807]: I1127 11:11:52.254386 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbf1516717dc882ba468739af83527a26b4521d3ef463bf1f6e53141c957fe33" Nov 27 11:11:52 crc kubenswrapper[4807]: I1127 11:11:52.267325 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hkmnz" Nov 27 11:11:52 crc kubenswrapper[4807]: I1127 11:11:52.712857 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Nov 27 11:11:53 crc kubenswrapper[4807]: I1127 11:11:53.185262 4807 patch_prober.go:28] interesting pod/router-default-5444994796-7kpfm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 11:11:53 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Nov 27 11:11:53 
crc kubenswrapper[4807]: [+]process-running ok Nov 27 11:11:53 crc kubenswrapper[4807]: healthz check failed Nov 27 11:11:53 crc kubenswrapper[4807]: I1127 11:11:53.185324 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7kpfm" podUID="0a7ac40a-0ecf-482a-9353-e6c71787da7e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:11:53 crc kubenswrapper[4807]: I1127 11:11:53.273267 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"605c0c71-6929-4ede-8a49-22bfd1d9a8ac","Type":"ContainerStarted","Data":"bde91872f8c4f2089bec2c2fcc284ae217c72b59dc2225b4f5a7eb8f754f3349"} Nov 27 11:11:53 crc kubenswrapper[4807]: I1127 11:11:53.273311 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"605c0c71-6929-4ede-8a49-22bfd1d9a8ac","Type":"ContainerStarted","Data":"aaaa956d6d26fcb87ce0043ad38816e75df1d67be611da150d98b32ca795cc01"} Nov 27 11:11:53 crc kubenswrapper[4807]: I1127 11:11:53.277108 4807 generic.go:334] "Generic (PLEG): container finished" podID="54e639c6-6aad-4df0-b118-cd50f488a026" containerID="e977dafb2807ae6039659f3a86f75223fa547de715af86b8caf49e20e8f7ea0b" exitCode=0 Nov 27 11:11:53 crc kubenswrapper[4807]: I1127 11:11:53.277195 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7s6f" event={"ID":"54e639c6-6aad-4df0-b118-cd50f488a026","Type":"ContainerDied","Data":"e977dafb2807ae6039659f3a86f75223fa547de715af86b8caf49e20e8f7ea0b"} Nov 27 11:11:53 crc kubenswrapper[4807]: I1127 11:11:53.302830 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.30281467 podStartE2EDuration="2.30281467s" podCreationTimestamp="2025-11-27 11:11:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:11:53.300893113 +0000 UTC m=+154.400391311" watchObservedRunningTime="2025-11-27 11:11:53.30281467 +0000 UTC m=+154.402312868" Nov 27 11:11:53 crc kubenswrapper[4807]: I1127 11:11:53.624544 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rqc9w" Nov 27 11:11:54 crc kubenswrapper[4807]: I1127 11:11:54.183584 4807 patch_prober.go:28] interesting pod/router-default-5444994796-7kpfm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 11:11:54 crc kubenswrapper[4807]: [-]has-synced failed: reason withheld Nov 27 11:11:54 crc kubenswrapper[4807]: [+]process-running ok Nov 27 11:11:54 crc kubenswrapper[4807]: healthz check failed Nov 27 11:11:54 crc kubenswrapper[4807]: I1127 11:11:54.183640 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7kpfm" podUID="0a7ac40a-0ecf-482a-9353-e6c71787da7e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:11:54 crc kubenswrapper[4807]: I1127 11:11:54.291277 4807 generic.go:334] "Generic (PLEG): container finished" podID="605c0c71-6929-4ede-8a49-22bfd1d9a8ac" containerID="bde91872f8c4f2089bec2c2fcc284ae217c72b59dc2225b4f5a7eb8f754f3349" exitCode=0 Nov 27 11:11:54 crc kubenswrapper[4807]: I1127 11:11:54.291347 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"605c0c71-6929-4ede-8a49-22bfd1d9a8ac","Type":"ContainerDied","Data":"bde91872f8c4f2089bec2c2fcc284ae217c72b59dc2225b4f5a7eb8f754f3349"} Nov 27 11:11:55 crc kubenswrapper[4807]: I1127 11:11:55.187126 4807 patch_prober.go:28] interesting pod/router-default-5444994796-7kpfm container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 27 11:11:55 crc kubenswrapper[4807]: [+]has-synced ok Nov 27 11:11:55 crc kubenswrapper[4807]: [+]process-running ok Nov 27 11:11:55 crc kubenswrapper[4807]: healthz check failed Nov 27 11:11:55 crc kubenswrapper[4807]: I1127 11:11:55.187180 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7kpfm" podUID="0a7ac40a-0ecf-482a-9353-e6c71787da7e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 27 11:11:56 crc kubenswrapper[4807]: I1127 11:11:56.184927 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:56 crc kubenswrapper[4807]: I1127 11:11:56.187042 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-7kpfm" Nov 27 11:11:59 crc kubenswrapper[4807]: I1127 11:11:59.855703 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-jbssf" Nov 27 11:12:00 crc kubenswrapper[4807]: I1127 11:12:00.446066 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:12:00 crc kubenswrapper[4807]: I1127 11:12:00.470196 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/911bce2f-3fb2-484d-870f-d9737047bd10-metrics-certs\") pod \"network-metrics-daemon-wszmz\" (UID: \"911bce2f-3fb2-484d-870f-d9737047bd10\") " pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:12:00 crc 
kubenswrapper[4807]: I1127 11:12:00.655296 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wszmz" Nov 27 11:12:01 crc kubenswrapper[4807]: I1127 11:12:01.494178 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:12:01 crc kubenswrapper[4807]: I1127 11:12:01.499464 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:12:04 crc kubenswrapper[4807]: I1127 11:12:04.150087 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 11:12:04 crc kubenswrapper[4807]: I1127 11:12:04.191424 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/605c0c71-6929-4ede-8a49-22bfd1d9a8ac-kube-api-access\") pod \"605c0c71-6929-4ede-8a49-22bfd1d9a8ac\" (UID: \"605c0c71-6929-4ede-8a49-22bfd1d9a8ac\") " Nov 27 11:12:04 crc kubenswrapper[4807]: I1127 11:12:04.191542 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/605c0c71-6929-4ede-8a49-22bfd1d9a8ac-kubelet-dir\") pod \"605c0c71-6929-4ede-8a49-22bfd1d9a8ac\" (UID: \"605c0c71-6929-4ede-8a49-22bfd1d9a8ac\") " Nov 27 11:12:04 crc kubenswrapper[4807]: I1127 11:12:04.191667 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/605c0c71-6929-4ede-8a49-22bfd1d9a8ac-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "605c0c71-6929-4ede-8a49-22bfd1d9a8ac" (UID: "605c0c71-6929-4ede-8a49-22bfd1d9a8ac"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:12:04 crc kubenswrapper[4807]: I1127 11:12:04.192115 4807 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/605c0c71-6929-4ede-8a49-22bfd1d9a8ac-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:04 crc kubenswrapper[4807]: I1127 11:12:04.195925 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/605c0c71-6929-4ede-8a49-22bfd1d9a8ac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "605c0c71-6929-4ede-8a49-22bfd1d9a8ac" (UID: "605c0c71-6929-4ede-8a49-22bfd1d9a8ac"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:12:04 crc kubenswrapper[4807]: I1127 11:12:04.292883 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/605c0c71-6929-4ede-8a49-22bfd1d9a8ac-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:04 crc kubenswrapper[4807]: I1127 11:12:04.352881 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"605c0c71-6929-4ede-8a49-22bfd1d9a8ac","Type":"ContainerDied","Data":"aaaa956d6d26fcb87ce0043ad38816e75df1d67be611da150d98b32ca795cc01"} Nov 27 11:12:04 crc kubenswrapper[4807]: I1127 11:12:04.352923 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaaa956d6d26fcb87ce0043ad38816e75df1d67be611da150d98b32ca795cc01" Nov 27 11:12:04 crc kubenswrapper[4807]: I1127 11:12:04.352925 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 27 11:12:09 crc kubenswrapper[4807]: I1127 11:12:09.502803 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:12:14 crc kubenswrapper[4807]: E1127 11:12:14.985915 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 27 11:12:14 crc kubenswrapper[4807]: E1127 11:12:14.986342 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5qxg4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:ni
l,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-tmwd8_openshift-marketplace(6fac2baf-01de-4e80-8434-4488846fd7fb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 11:12:14 crc kubenswrapper[4807]: E1127 11:12:14.987512 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-tmwd8" podUID="6fac2baf-01de-4e80-8434-4488846fd7fb" Nov 27 11:12:19 crc kubenswrapper[4807]: E1127 11:12:19.084124 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-tmwd8" podUID="6fac2baf-01de-4e80-8434-4488846fd7fb" Nov 27 11:12:19 crc kubenswrapper[4807]: E1127 11:12:19.740723 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 27 11:12:19 crc kubenswrapper[4807]: E1127 11:12:19.741580 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9hkt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-vhk66_openshift-marketplace(c61dff77-8482-4e06-b99e-72c1cd18c4ca): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 11:12:19 crc kubenswrapper[4807]: E1127 11:12:19.742776 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-vhk66" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" Nov 27 11:12:20 crc 
kubenswrapper[4807]: I1127 11:12:20.921389 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:12:20 crc kubenswrapper[4807]: I1127 11:12:20.921767 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:12:21 crc kubenswrapper[4807]: E1127 11:12:21.811625 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-vhk66" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" Nov 27 11:12:21 crc kubenswrapper[4807]: E1127 11:12:21.867588 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 27 11:12:21 crc kubenswrapper[4807]: E1127 11:12:21.867754 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gdlqt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-t9x6p_openshift-marketplace(1df798f7-f40a-41b8-b4fb-1d6cf4e05f52): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 11:12:21 crc kubenswrapper[4807]: E1127 11:12:21.868956 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-t9x6p" podUID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" Nov 27 11:12:21 crc 
kubenswrapper[4807]: I1127 11:12:21.906848 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2pt99" Nov 27 11:12:21 crc kubenswrapper[4807]: E1127 11:12:21.910307 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Nov 27 11:12:21 crc kubenswrapper[4807]: E1127 11:12:21.910433 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jzlmq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFro
mSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bl9gg_openshift-marketplace(4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 11:12:21 crc kubenswrapper[4807]: E1127 11:12:21.911635 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bl9gg" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" Nov 27 11:12:23 crc kubenswrapper[4807]: E1127 11:12:23.153424 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-t9x6p" podUID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" Nov 27 11:12:23 crc kubenswrapper[4807]: E1127 11:12:23.153971 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bl9gg" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" Nov 27 11:12:23 crc kubenswrapper[4807]: E1127 11:12:23.251744 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 27 11:12:23 crc kubenswrapper[4807]: E1127 11:12:23.252152 4807 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tj497,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-f8qqm_openshift-marketplace(16652a33-af22-4522-bd9c-8491bd6ae24f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 11:12:23 crc kubenswrapper[4807]: E1127 11:12:23.253629 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-f8qqm" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" Nov 27 11:12:23 crc kubenswrapper[4807]: E1127 11:12:23.266405 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 27 11:12:23 crc kubenswrapper[4807]: E1127 11:12:23.266577 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zjt52,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,A
ppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-ntrpq_openshift-marketplace(14518714-cc42-4323-a3c9-307047368353): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 11:12:23 crc kubenswrapper[4807]: E1127 11:12:23.267729 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-ntrpq" podUID="14518714-cc42-4323-a3c9-307047368353" Nov 27 11:12:23 crc kubenswrapper[4807]: I1127 11:12:23.452272 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7s6f" event={"ID":"54e639c6-6aad-4df0-b118-cd50f488a026","Type":"ContainerStarted","Data":"b067b924da6877fecad1d00102573df031b79d8d47ae3a65f229715c253b6da0"} Nov 27 11:12:23 crc kubenswrapper[4807]: I1127 11:12:23.454603 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zn8d" event={"ID":"7984b376-029e-465e-893e-f62f047ee418","Type":"ContainerStarted","Data":"71b5b4c6e1fdf96acfbcc0bec52eb31b3f0f5ef4c5d4db9eb5f845da789deb78"} Nov 27 11:12:23 crc kubenswrapper[4807]: E1127 11:12:23.456072 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-f8qqm" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" Nov 27 11:12:23 crc kubenswrapper[4807]: E1127 
11:12:23.456447 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-ntrpq" podUID="14518714-cc42-4323-a3c9-307047368353" Nov 27 11:12:23 crc kubenswrapper[4807]: I1127 11:12:23.565441 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wszmz"] Nov 27 11:12:23 crc kubenswrapper[4807]: W1127 11:12:23.567389 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod911bce2f_3fb2_484d_870f_d9737047bd10.slice/crio-350ed51b8f7f93d026fc28aeb20f3652ef060f5795c11aae6b9b603585472076 WatchSource:0}: Error finding container 350ed51b8f7f93d026fc28aeb20f3652ef060f5795c11aae6b9b603585472076: Status 404 returned error can't find the container with id 350ed51b8f7f93d026fc28aeb20f3652ef060f5795c11aae6b9b603585472076 Nov 27 11:12:24 crc kubenswrapper[4807]: I1127 11:12:24.464415 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wszmz" event={"ID":"911bce2f-3fb2-484d-870f-d9737047bd10","Type":"ContainerStarted","Data":"c7dc1302c34cc8cbe22b114f9ea00dd310029aa2e33f473147e5f39e97f8b5f4"} Nov 27 11:12:24 crc kubenswrapper[4807]: I1127 11:12:24.464997 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wszmz" event={"ID":"911bce2f-3fb2-484d-870f-d9737047bd10","Type":"ContainerStarted","Data":"1b723a1f4e14b94027026ee3e9513014a6bbdc832b9def45df56f34d848dadaf"} Nov 27 11:12:24 crc kubenswrapper[4807]: I1127 11:12:24.465018 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wszmz" 
event={"ID":"911bce2f-3fb2-484d-870f-d9737047bd10","Type":"ContainerStarted","Data":"350ed51b8f7f93d026fc28aeb20f3652ef060f5795c11aae6b9b603585472076"} Nov 27 11:12:24 crc kubenswrapper[4807]: I1127 11:12:24.470303 4807 generic.go:334] "Generic (PLEG): container finished" podID="7984b376-029e-465e-893e-f62f047ee418" containerID="71b5b4c6e1fdf96acfbcc0bec52eb31b3f0f5ef4c5d4db9eb5f845da789deb78" exitCode=0 Nov 27 11:12:24 crc kubenswrapper[4807]: I1127 11:12:24.470490 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zn8d" event={"ID":"7984b376-029e-465e-893e-f62f047ee418","Type":"ContainerDied","Data":"71b5b4c6e1fdf96acfbcc0bec52eb31b3f0f5ef4c5d4db9eb5f845da789deb78"} Nov 27 11:12:24 crc kubenswrapper[4807]: I1127 11:12:24.475313 4807 generic.go:334] "Generic (PLEG): container finished" podID="54e639c6-6aad-4df0-b118-cd50f488a026" containerID="b067b924da6877fecad1d00102573df031b79d8d47ae3a65f229715c253b6da0" exitCode=0 Nov 27 11:12:24 crc kubenswrapper[4807]: I1127 11:12:24.475433 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7s6f" event={"ID":"54e639c6-6aad-4df0-b118-cd50f488a026","Type":"ContainerDied","Data":"b067b924da6877fecad1d00102573df031b79d8d47ae3a65f229715c253b6da0"} Nov 27 11:12:24 crc kubenswrapper[4807]: I1127 11:12:24.492849 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wszmz" podStartSLOduration=166.492819198 podStartE2EDuration="2m46.492819198s" podCreationTimestamp="2025-11-27 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:12:24.490578632 +0000 UTC m=+185.590076890" watchObservedRunningTime="2025-11-27 11:12:24.492819198 +0000 UTC m=+185.592317446" Nov 27 11:12:25 crc kubenswrapper[4807]: I1127 11:12:25.483536 4807 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-p7s6f" event={"ID":"54e639c6-6aad-4df0-b118-cd50f488a026","Type":"ContainerStarted","Data":"7afff3ff13d183a96c47bbff27e5f29594b25d9a7f09bba7e77745c24dbea464"} Nov 27 11:12:25 crc kubenswrapper[4807]: I1127 11:12:25.485813 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zn8d" event={"ID":"7984b376-029e-465e-893e-f62f047ee418","Type":"ContainerStarted","Data":"43436aee429a3e83d476f833f00c7d06382a1694db5d0a1b763cda166725e3d3"} Nov 27 11:12:25 crc kubenswrapper[4807]: I1127 11:12:25.517434 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p7s6f" podStartSLOduration=3.475302301 podStartE2EDuration="35.517415569s" podCreationTimestamp="2025-11-27 11:11:50 +0000 UTC" firstStartedPulling="2025-11-27 11:11:53.281540887 +0000 UTC m=+154.381039085" lastFinishedPulling="2025-11-27 11:12:25.323654155 +0000 UTC m=+186.423152353" observedRunningTime="2025-11-27 11:12:25.515176303 +0000 UTC m=+186.614674501" watchObservedRunningTime="2025-11-27 11:12:25.517415569 +0000 UTC m=+186.616913767" Nov 27 11:12:25 crc kubenswrapper[4807]: I1127 11:12:25.532680 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4zn8d" podStartSLOduration=3.498815686 podStartE2EDuration="39.532661053s" podCreationTimestamp="2025-11-27 11:11:46 +0000 UTC" firstStartedPulling="2025-11-27 11:11:48.984062652 +0000 UTC m=+150.083560860" lastFinishedPulling="2025-11-27 11:12:25.017908029 +0000 UTC m=+186.117406227" observedRunningTime="2025-11-27 11:12:25.528411196 +0000 UTC m=+186.627909394" watchObservedRunningTime="2025-11-27 11:12:25.532661053 +0000 UTC m=+186.632159251" Nov 27 11:12:27 crc kubenswrapper[4807]: I1127 11:12:27.167933 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 27 
11:12:27 crc kubenswrapper[4807]: I1127 11:12:27.280043 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:12:27 crc kubenswrapper[4807]: I1127 11:12:27.280091 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:12:28 crc kubenswrapper[4807]: I1127 11:12:28.478387 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4zn8d" podUID="7984b376-029e-465e-893e-f62f047ee418" containerName="registry-server" probeResult="failure" output=< Nov 27 11:12:28 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Nov 27 11:12:28 crc kubenswrapper[4807]: > Nov 27 11:12:28 crc kubenswrapper[4807]: I1127 11:12:28.574822 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qsdql"] Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.572809 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 27 11:12:30 crc kubenswrapper[4807]: E1127 11:12:30.573322 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="605c0c71-6929-4ede-8a49-22bfd1d9a8ac" containerName="pruner" Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.573335 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="605c0c71-6929-4ede-8a49-22bfd1d9a8ac" containerName="pruner" Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.573450 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="605c0c71-6929-4ede-8a49-22bfd1d9a8ac" containerName="pruner" Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.573814 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.575539 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.577845 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.581789 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.728453 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/861b7ecf-e258-42a7-a4e8-0319e820dc70-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"861b7ecf-e258-42a7-a4e8-0319e820dc70\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.728606 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/861b7ecf-e258-42a7-a4e8-0319e820dc70-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"861b7ecf-e258-42a7-a4e8-0319e820dc70\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.830589 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/861b7ecf-e258-42a7-a4e8-0319e820dc70-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"861b7ecf-e258-42a7-a4e8-0319e820dc70\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.830905 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/861b7ecf-e258-42a7-a4e8-0319e820dc70-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"861b7ecf-e258-42a7-a4e8-0319e820dc70\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.833797 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/861b7ecf-e258-42a7-a4e8-0319e820dc70-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"861b7ecf-e258-42a7-a4e8-0319e820dc70\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.853821 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/861b7ecf-e258-42a7-a4e8-0319e820dc70-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"861b7ecf-e258-42a7-a4e8-0319e820dc70\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 11:12:30 crc kubenswrapper[4807]: I1127 11:12:30.893135 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 11:12:31 crc kubenswrapper[4807]: I1127 11:12:31.091468 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:12:31 crc kubenswrapper[4807]: I1127 11:12:31.091735 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:12:31 crc kubenswrapper[4807]: I1127 11:12:31.129980 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:12:31 crc kubenswrapper[4807]: I1127 11:12:31.270676 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Nov 27 11:12:31 crc kubenswrapper[4807]: I1127 11:12:31.514420 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"861b7ecf-e258-42a7-a4e8-0319e820dc70","Type":"ContainerStarted","Data":"b642056b22c9159d86c86eb36c7972440fa9998dfb78ef37781e969f7a88b444"} Nov 27 11:12:31 crc kubenswrapper[4807]: I1127 11:12:31.561760 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:12:31 crc kubenswrapper[4807]: I1127 11:12:31.596028 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p7s6f"] Nov 27 11:12:32 crc kubenswrapper[4807]: I1127 11:12:32.520101 4807 generic.go:334] "Generic (PLEG): container finished" podID="861b7ecf-e258-42a7-a4e8-0319e820dc70" containerID="e4266f74f374724e8a182ac8fb5efd335270bde2c4f085b1425946f970f5f5d6" exitCode=0 Nov 27 11:12:32 crc kubenswrapper[4807]: I1127 11:12:32.520383 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"861b7ecf-e258-42a7-a4e8-0319e820dc70","Type":"ContainerDied","Data":"e4266f74f374724e8a182ac8fb5efd335270bde2c4f085b1425946f970f5f5d6"} Nov 27 11:12:33 crc kubenswrapper[4807]: I1127 11:12:33.525046 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p7s6f" podUID="54e639c6-6aad-4df0-b118-cd50f488a026" containerName="registry-server" containerID="cri-o://7afff3ff13d183a96c47bbff27e5f29594b25d9a7f09bba7e77745c24dbea464" gracePeriod=2 Nov 27 11:12:33 crc kubenswrapper[4807]: I1127 11:12:33.763878 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 11:12:33 crc kubenswrapper[4807]: I1127 11:12:33.869775 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/861b7ecf-e258-42a7-a4e8-0319e820dc70-kubelet-dir\") pod \"861b7ecf-e258-42a7-a4e8-0319e820dc70\" (UID: \"861b7ecf-e258-42a7-a4e8-0319e820dc70\") " Nov 27 11:12:33 crc kubenswrapper[4807]: I1127 11:12:33.869858 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/861b7ecf-e258-42a7-a4e8-0319e820dc70-kube-api-access\") pod \"861b7ecf-e258-42a7-a4e8-0319e820dc70\" (UID: \"861b7ecf-e258-42a7-a4e8-0319e820dc70\") " Nov 27 11:12:33 crc kubenswrapper[4807]: I1127 11:12:33.871095 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/861b7ecf-e258-42a7-a4e8-0319e820dc70-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "861b7ecf-e258-42a7-a4e8-0319e820dc70" (UID: "861b7ecf-e258-42a7-a4e8-0319e820dc70"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:12:33 crc kubenswrapper[4807]: I1127 11:12:33.881188 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861b7ecf-e258-42a7-a4e8-0319e820dc70-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "861b7ecf-e258-42a7-a4e8-0319e820dc70" (UID: "861b7ecf-e258-42a7-a4e8-0319e820dc70"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:12:33 crc kubenswrapper[4807]: I1127 11:12:33.924757 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:12:33 crc kubenswrapper[4807]: I1127 11:12:33.970835 4807 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/861b7ecf-e258-42a7-a4e8-0319e820dc70-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:33 crc kubenswrapper[4807]: I1127 11:12:33.970858 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/861b7ecf-e258-42a7-a4e8-0319e820dc70-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.071440 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e639c6-6aad-4df0-b118-cd50f488a026-utilities\") pod \"54e639c6-6aad-4df0-b118-cd50f488a026\" (UID: \"54e639c6-6aad-4df0-b118-cd50f488a026\") " Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.071597 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxq2n\" (UniqueName: \"kubernetes.io/projected/54e639c6-6aad-4df0-b118-cd50f488a026-kube-api-access-bxq2n\") pod \"54e639c6-6aad-4df0-b118-cd50f488a026\" (UID: \"54e639c6-6aad-4df0-b118-cd50f488a026\") " Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 
11:12:34.071652 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e639c6-6aad-4df0-b118-cd50f488a026-catalog-content\") pod \"54e639c6-6aad-4df0-b118-cd50f488a026\" (UID: \"54e639c6-6aad-4df0-b118-cd50f488a026\") " Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.072485 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e639c6-6aad-4df0-b118-cd50f488a026-utilities" (OuterVolumeSpecName: "utilities") pod "54e639c6-6aad-4df0-b118-cd50f488a026" (UID: "54e639c6-6aad-4df0-b118-cd50f488a026"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.074217 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54e639c6-6aad-4df0-b118-cd50f488a026-kube-api-access-bxq2n" (OuterVolumeSpecName: "kube-api-access-bxq2n") pod "54e639c6-6aad-4df0-b118-cd50f488a026" (UID: "54e639c6-6aad-4df0-b118-cd50f488a026"). InnerVolumeSpecName "kube-api-access-bxq2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.172681 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54e639c6-6aad-4df0-b118-cd50f488a026-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.172712 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxq2n\" (UniqueName: \"kubernetes.io/projected/54e639c6-6aad-4df0-b118-cd50f488a026-kube-api-access-bxq2n\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.185490 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e639c6-6aad-4df0-b118-cd50f488a026-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54e639c6-6aad-4df0-b118-cd50f488a026" (UID: "54e639c6-6aad-4df0-b118-cd50f488a026"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.273891 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54e639c6-6aad-4df0-b118-cd50f488a026-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.531360 4807 generic.go:334] "Generic (PLEG): container finished" podID="54e639c6-6aad-4df0-b118-cd50f488a026" containerID="7afff3ff13d183a96c47bbff27e5f29594b25d9a7f09bba7e77745c24dbea464" exitCode=0 Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.531415 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p7s6f" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.531427 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7s6f" event={"ID":"54e639c6-6aad-4df0-b118-cd50f488a026","Type":"ContainerDied","Data":"7afff3ff13d183a96c47bbff27e5f29594b25d9a7f09bba7e77745c24dbea464"} Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.531451 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p7s6f" event={"ID":"54e639c6-6aad-4df0-b118-cd50f488a026","Type":"ContainerDied","Data":"77f592610ef2449b256cd7c075a1f58d68d487fa8fc48806f74e7d5ffbd837f2"} Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.531467 4807 scope.go:117] "RemoveContainer" containerID="7afff3ff13d183a96c47bbff27e5f29594b25d9a7f09bba7e77745c24dbea464" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.532706 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.532691 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"861b7ecf-e258-42a7-a4e8-0319e820dc70","Type":"ContainerDied","Data":"b642056b22c9159d86c86eb36c7972440fa9998dfb78ef37781e969f7a88b444"} Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.532813 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b642056b22c9159d86c86eb36c7972440fa9998dfb78ef37781e969f7a88b444" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.550506 4807 scope.go:117] "RemoveContainer" containerID="b067b924da6877fecad1d00102573df031b79d8d47ae3a65f229715c253b6da0" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.567565 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p7s6f"] Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.570015 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p7s6f"] Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.571405 4807 scope.go:117] "RemoveContainer" containerID="e977dafb2807ae6039659f3a86f75223fa547de715af86b8caf49e20e8f7ea0b" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.597158 4807 scope.go:117] "RemoveContainer" containerID="7afff3ff13d183a96c47bbff27e5f29594b25d9a7f09bba7e77745c24dbea464" Nov 27 11:12:34 crc kubenswrapper[4807]: E1127 11:12:34.597903 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7afff3ff13d183a96c47bbff27e5f29594b25d9a7f09bba7e77745c24dbea464\": container with ID starting with 7afff3ff13d183a96c47bbff27e5f29594b25d9a7f09bba7e77745c24dbea464 not found: ID does not exist" containerID="7afff3ff13d183a96c47bbff27e5f29594b25d9a7f09bba7e77745c24dbea464" Nov 27 11:12:34 crc kubenswrapper[4807]: 
I1127 11:12:34.597931 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7afff3ff13d183a96c47bbff27e5f29594b25d9a7f09bba7e77745c24dbea464"} err="failed to get container status \"7afff3ff13d183a96c47bbff27e5f29594b25d9a7f09bba7e77745c24dbea464\": rpc error: code = NotFound desc = could not find container \"7afff3ff13d183a96c47bbff27e5f29594b25d9a7f09bba7e77745c24dbea464\": container with ID starting with 7afff3ff13d183a96c47bbff27e5f29594b25d9a7f09bba7e77745c24dbea464 not found: ID does not exist" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.598039 4807 scope.go:117] "RemoveContainer" containerID="b067b924da6877fecad1d00102573df031b79d8d47ae3a65f229715c253b6da0" Nov 27 11:12:34 crc kubenswrapper[4807]: E1127 11:12:34.598367 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b067b924da6877fecad1d00102573df031b79d8d47ae3a65f229715c253b6da0\": container with ID starting with b067b924da6877fecad1d00102573df031b79d8d47ae3a65f229715c253b6da0 not found: ID does not exist" containerID="b067b924da6877fecad1d00102573df031b79d8d47ae3a65f229715c253b6da0" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.598388 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b067b924da6877fecad1d00102573df031b79d8d47ae3a65f229715c253b6da0"} err="failed to get container status \"b067b924da6877fecad1d00102573df031b79d8d47ae3a65f229715c253b6da0\": rpc error: code = NotFound desc = could not find container \"b067b924da6877fecad1d00102573df031b79d8d47ae3a65f229715c253b6da0\": container with ID starting with b067b924da6877fecad1d00102573df031b79d8d47ae3a65f229715c253b6da0 not found: ID does not exist" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.598402 4807 scope.go:117] "RemoveContainer" containerID="e977dafb2807ae6039659f3a86f75223fa547de715af86b8caf49e20e8f7ea0b" Nov 27 11:12:34 crc 
kubenswrapper[4807]: E1127 11:12:34.598814 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e977dafb2807ae6039659f3a86f75223fa547de715af86b8caf49e20e8f7ea0b\": container with ID starting with e977dafb2807ae6039659f3a86f75223fa547de715af86b8caf49e20e8f7ea0b not found: ID does not exist" containerID="e977dafb2807ae6039659f3a86f75223fa547de715af86b8caf49e20e8f7ea0b" Nov 27 11:12:34 crc kubenswrapper[4807]: I1127 11:12:34.598936 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e977dafb2807ae6039659f3a86f75223fa547de715af86b8caf49e20e8f7ea0b"} err="failed to get container status \"e977dafb2807ae6039659f3a86f75223fa547de715af86b8caf49e20e8f7ea0b\": rpc error: code = NotFound desc = could not find container \"e977dafb2807ae6039659f3a86f75223fa547de715af86b8caf49e20e8f7ea0b\": container with ID starting with e977dafb2807ae6039659f3a86f75223fa547de715af86b8caf49e20e8f7ea0b not found: ID does not exist" Nov 27 11:12:35 crc kubenswrapper[4807]: I1127 11:12:35.542690 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54e639c6-6aad-4df0-b118-cd50f488a026" path="/var/lib/kubelet/pods/54e639c6-6aad-4df0-b118-cd50f488a026/volumes" Nov 27 11:12:35 crc kubenswrapper[4807]: I1127 11:12:35.545229 4807 generic.go:334] "Generic (PLEG): container finished" podID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" containerID="0af6172037604eb53e48d69b8e6f14d65daf61f55ce5f6d0cc190f2c140e70a9" exitCode=0 Nov 27 11:12:35 crc kubenswrapper[4807]: I1127 11:12:35.545363 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhk66" event={"ID":"c61dff77-8482-4e06-b99e-72c1cd18c4ca","Type":"ContainerDied","Data":"0af6172037604eb53e48d69b8e6f14d65daf61f55ce5f6d0cc190f2c140e70a9"} Nov 27 11:12:35 crc kubenswrapper[4807]: I1127 11:12:35.554975 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-tmwd8" event={"ID":"6fac2baf-01de-4e80-8434-4488846fd7fb","Type":"ContainerStarted","Data":"44be6e4d075507494d251dd3aedd5ec18a03be81700a42aa4734ee19419768a5"} Nov 27 11:12:36 crc kubenswrapper[4807]: I1127 11:12:36.565209 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9x6p" event={"ID":"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52","Type":"ContainerStarted","Data":"e105fd3cf09285c67abb93cb069639153bd350bf2d3368168b2c2fea3f1e1ec9"} Nov 27 11:12:36 crc kubenswrapper[4807]: I1127 11:12:36.570456 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhk66" event={"ID":"c61dff77-8482-4e06-b99e-72c1cd18c4ca","Type":"ContainerStarted","Data":"a4aed7eaa6d4fd1e387daaa33fca50dbc023590adaac39e0638bf473aad02afe"} Nov 27 11:12:36 crc kubenswrapper[4807]: I1127 11:12:36.573499 4807 generic.go:334] "Generic (PLEG): container finished" podID="6fac2baf-01de-4e80-8434-4488846fd7fb" containerID="44be6e4d075507494d251dd3aedd5ec18a03be81700a42aa4734ee19419768a5" exitCode=0 Nov 27 11:12:36 crc kubenswrapper[4807]: I1127 11:12:36.573539 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmwd8" event={"ID":"6fac2baf-01de-4e80-8434-4488846fd7fb","Type":"ContainerDied","Data":"44be6e4d075507494d251dd3aedd5ec18a03be81700a42aa4734ee19419768a5"} Nov 27 11:12:36 crc kubenswrapper[4807]: I1127 11:12:36.606497 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vhk66" podStartSLOduration=2.77752168 podStartE2EDuration="46.606475317s" podCreationTimestamp="2025-11-27 11:11:50 +0000 UTC" firstStartedPulling="2025-11-27 11:11:52.229412967 +0000 UTC m=+153.328911165" lastFinishedPulling="2025-11-27 11:12:36.058366594 +0000 UTC m=+197.157864802" observedRunningTime="2025-11-27 11:12:36.605830508 +0000 UTC m=+197.705328706" 
watchObservedRunningTime="2025-11-27 11:12:36.606475317 +0000 UTC m=+197.705973535" Nov 27 11:12:37 crc kubenswrapper[4807]: I1127 11:12:37.354017 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:12:37 crc kubenswrapper[4807]: I1127 11:12:37.417714 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:12:37 crc kubenswrapper[4807]: I1127 11:12:37.587625 4807 generic.go:334] "Generic (PLEG): container finished" podID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" containerID="f0c7c74900eb6fedcc44429d2a98cdd1e27b56069ecb54031c294c2f5e4ec3c3" exitCode=0 Nov 27 11:12:37 crc kubenswrapper[4807]: I1127 11:12:37.587709 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9gg" event={"ID":"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5","Type":"ContainerDied","Data":"f0c7c74900eb6fedcc44429d2a98cdd1e27b56069ecb54031c294c2f5e4ec3c3"} Nov 27 11:12:37 crc kubenswrapper[4807]: I1127 11:12:37.594554 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmwd8" event={"ID":"6fac2baf-01de-4e80-8434-4488846fd7fb","Type":"ContainerStarted","Data":"a0516c1f87e57946e7056492ee61d814a58a53eadbc1132bb25e12e35f66bbc7"} Nov 27 11:12:37 crc kubenswrapper[4807]: I1127 11:12:37.596787 4807 generic.go:334] "Generic (PLEG): container finished" podID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" containerID="e105fd3cf09285c67abb93cb069639153bd350bf2d3368168b2c2fea3f1e1ec9" exitCode=0 Nov 27 11:12:37 crc kubenswrapper[4807]: I1127 11:12:37.597174 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9x6p" event={"ID":"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52","Type":"ContainerDied","Data":"e105fd3cf09285c67abb93cb069639153bd350bf2d3368168b2c2fea3f1e1ec9"} Nov 27 11:12:37 crc kubenswrapper[4807]: I1127 
11:12:37.597192 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9x6p" event={"ID":"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52","Type":"ContainerStarted","Data":"593696e870afdf875df7268c71b41837621c5b011f473cf4c1df325bdf414e66"} Nov 27 11:12:37 crc kubenswrapper[4807]: I1127 11:12:37.621770 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t9x6p" podStartSLOduration=2.692720707 podStartE2EDuration="48.621756047s" podCreationTimestamp="2025-11-27 11:11:49 +0000 UTC" firstStartedPulling="2025-11-27 11:11:51.123353344 +0000 UTC m=+152.222851542" lastFinishedPulling="2025-11-27 11:12:37.052388694 +0000 UTC m=+198.151886882" observedRunningTime="2025-11-27 11:12:37.620543001 +0000 UTC m=+198.720041189" watchObservedRunningTime="2025-11-27 11:12:37.621756047 +0000 UTC m=+198.721254245" Nov 27 11:12:37 crc kubenswrapper[4807]: I1127 11:12:37.652593 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tmwd8" podStartSLOduration=2.400464933 podStartE2EDuration="50.652576074s" podCreationTimestamp="2025-11-27 11:11:47 +0000 UTC" firstStartedPulling="2025-11-27 11:11:48.974432605 +0000 UTC m=+150.073930803" lastFinishedPulling="2025-11-27 11:12:37.226543746 +0000 UTC m=+198.326041944" observedRunningTime="2025-11-27 11:12:37.650511064 +0000 UTC m=+198.750009262" watchObservedRunningTime="2025-11-27 11:12:37.652576074 +0000 UTC m=+198.752074272" Nov 27 11:12:37 crc kubenswrapper[4807]: I1127 11:12:37.683637 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:12:37 crc kubenswrapper[4807]: I1127 11:12:37.683791 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.181480 4807 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 27 11:12:38 crc kubenswrapper[4807]: E1127 11:12:38.182121 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e639c6-6aad-4df0-b118-cd50f488a026" containerName="extract-utilities" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.182144 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e639c6-6aad-4df0-b118-cd50f488a026" containerName="extract-utilities" Nov 27 11:12:38 crc kubenswrapper[4807]: E1127 11:12:38.182169 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e639c6-6aad-4df0-b118-cd50f488a026" containerName="registry-server" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.182177 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e639c6-6aad-4df0-b118-cd50f488a026" containerName="registry-server" Nov 27 11:12:38 crc kubenswrapper[4807]: E1127 11:12:38.182189 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861b7ecf-e258-42a7-a4e8-0319e820dc70" containerName="pruner" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.182197 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="861b7ecf-e258-42a7-a4e8-0319e820dc70" containerName="pruner" Nov 27 11:12:38 crc kubenswrapper[4807]: E1127 11:12:38.182236 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e639c6-6aad-4df0-b118-cd50f488a026" containerName="extract-content" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.182261 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e639c6-6aad-4df0-b118-cd50f488a026" containerName="extract-content" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.191756 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="861b7ecf-e258-42a7-a4e8-0319e820dc70" containerName="pruner" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.191805 4807 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="54e639c6-6aad-4df0-b118-cd50f488a026" containerName="registry-server" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.192264 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.194330 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.195569 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.195870 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.323451 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7674389a-8181-488b-bb03-d97eee98df00-kube-api-access\") pod \"installer-9-crc\" (UID: \"7674389a-8181-488b-bb03-d97eee98df00\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.323541 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7674389a-8181-488b-bb03-d97eee98df00-var-lock\") pod \"installer-9-crc\" (UID: \"7674389a-8181-488b-bb03-d97eee98df00\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.323589 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7674389a-8181-488b-bb03-d97eee98df00-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7674389a-8181-488b-bb03-d97eee98df00\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 11:12:38 crc 
kubenswrapper[4807]: I1127 11:12:38.424491 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7674389a-8181-488b-bb03-d97eee98df00-kube-api-access\") pod \"installer-9-crc\" (UID: \"7674389a-8181-488b-bb03-d97eee98df00\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.424539 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7674389a-8181-488b-bb03-d97eee98df00-var-lock\") pod \"installer-9-crc\" (UID: \"7674389a-8181-488b-bb03-d97eee98df00\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.424568 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7674389a-8181-488b-bb03-d97eee98df00-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7674389a-8181-488b-bb03-d97eee98df00\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.424640 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7674389a-8181-488b-bb03-d97eee98df00-kubelet-dir\") pod \"installer-9-crc\" (UID: \"7674389a-8181-488b-bb03-d97eee98df00\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.424750 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7674389a-8181-488b-bb03-d97eee98df00-var-lock\") pod \"installer-9-crc\" (UID: \"7674389a-8181-488b-bb03-d97eee98df00\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.442925 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/7674389a-8181-488b-bb03-d97eee98df00-kube-api-access\") pod \"installer-9-crc\" (UID: \"7674389a-8181-488b-bb03-d97eee98df00\") " pod="openshift-kube-apiserver/installer-9-crc" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.506706 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.603428 4807 generic.go:334] "Generic (PLEG): container finished" podID="14518714-cc42-4323-a3c9-307047368353" containerID="7ee7c3df800cb8dce1c414b9c92df5b2ada5671387defa14cae7cc2ba49d780c" exitCode=0 Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.603503 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntrpq" event={"ID":"14518714-cc42-4323-a3c9-307047368353","Type":"ContainerDied","Data":"7ee7c3df800cb8dce1c414b9c92df5b2ada5671387defa14cae7cc2ba49d780c"} Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.620465 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9gg" event={"ID":"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5","Type":"ContainerStarted","Data":"9ac12b95244684cc27613f0aeeb27eb96502f2c4428f581834fc5f68c4b9eda8"} Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.671141 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bl9gg" podStartSLOduration=2.464066637 podStartE2EDuration="49.671110787s" podCreationTimestamp="2025-11-27 11:11:49 +0000 UTC" firstStartedPulling="2025-11-27 11:11:51.123748516 +0000 UTC m=+152.223246714" lastFinishedPulling="2025-11-27 11:12:38.330792666 +0000 UTC m=+199.430290864" observedRunningTime="2025-11-27 11:12:38.654853043 +0000 UTC m=+199.754351241" watchObservedRunningTime="2025-11-27 11:12:38.671110787 +0000 UTC m=+199.770608975" Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.725442 4807 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-tmwd8" podUID="6fac2baf-01de-4e80-8434-4488846fd7fb" containerName="registry-server" probeResult="failure" output=< Nov 27 11:12:38 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Nov 27 11:12:38 crc kubenswrapper[4807]: > Nov 27 11:12:38 crc kubenswrapper[4807]: I1127 11:12:38.949573 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Nov 27 11:12:38 crc kubenswrapper[4807]: W1127 11:12:38.963546 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7674389a_8181_488b_bb03_d97eee98df00.slice/crio-c22ecf272f90841640c72ed319cff6d766eca0d5af8f1ae220c413145d70dd4b WatchSource:0}: Error finding container c22ecf272f90841640c72ed319cff6d766eca0d5af8f1ae220c413145d70dd4b: Status 404 returned error can't find the container with id c22ecf272f90841640c72ed319cff6d766eca0d5af8f1ae220c413145d70dd4b Nov 27 11:12:39 crc kubenswrapper[4807]: I1127 11:12:39.627021 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7674389a-8181-488b-bb03-d97eee98df00","Type":"ContainerStarted","Data":"050a36180f9e7911ef9624ecb3576188de7502de8109999d63a45c386e611687"} Nov 27 11:12:39 crc kubenswrapper[4807]: I1127 11:12:39.627465 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7674389a-8181-488b-bb03-d97eee98df00","Type":"ContainerStarted","Data":"c22ecf272f90841640c72ed319cff6d766eca0d5af8f1ae220c413145d70dd4b"} Nov 27 11:12:39 crc kubenswrapper[4807]: I1127 11:12:39.629021 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8qqm" event={"ID":"16652a33-af22-4522-bd9c-8491bd6ae24f","Type":"ContainerStarted","Data":"62f8d22f042b07bc62620047a44c6324f4ef1f42bd3098efe35cb00fe7c3228b"} Nov 27 11:12:39 crc 
kubenswrapper[4807]: I1127 11:12:39.631923 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntrpq" event={"ID":"14518714-cc42-4323-a3c9-307047368353","Type":"ContainerStarted","Data":"860b9eee03a484f38d0d4e4b13dfa6f582f18a5225432be87816010d87e477b6"} Nov 27 11:12:39 crc kubenswrapper[4807]: I1127 11:12:39.636384 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:12:39 crc kubenswrapper[4807]: I1127 11:12:39.636465 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:12:39 crc kubenswrapper[4807]: I1127 11:12:39.654413 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.654386733 podStartE2EDuration="1.654386733s" podCreationTimestamp="2025-11-27 11:12:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:12:39.653062944 +0000 UTC m=+200.752561142" watchObservedRunningTime="2025-11-27 11:12:39.654386733 +0000 UTC m=+200.753884931" Nov 27 11:12:39 crc kubenswrapper[4807]: I1127 11:12:39.677940 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ntrpq" podStartSLOduration=2.598768272 podStartE2EDuration="52.677924389s" podCreationTimestamp="2025-11-27 11:11:47 +0000 UTC" firstStartedPulling="2025-11-27 11:11:49.026653959 +0000 UTC m=+150.126152157" lastFinishedPulling="2025-11-27 11:12:39.105810076 +0000 UTC m=+200.205308274" observedRunningTime="2025-11-27 11:12:39.674124648 +0000 UTC m=+200.773622846" watchObservedRunningTime="2025-11-27 11:12:39.677924389 +0000 UTC m=+200.777422587" Nov 27 11:12:40 crc kubenswrapper[4807]: I1127 11:12:40.077420 4807 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:12:40 crc kubenswrapper[4807]: I1127 11:12:40.077723 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:12:40 crc kubenswrapper[4807]: I1127 11:12:40.125284 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:12:40 crc kubenswrapper[4807]: I1127 11:12:40.639830 4807 generic.go:334] "Generic (PLEG): container finished" podID="16652a33-af22-4522-bd9c-8491bd6ae24f" containerID="62f8d22f042b07bc62620047a44c6324f4ef1f42bd3098efe35cb00fe7c3228b" exitCode=0 Nov 27 11:12:40 crc kubenswrapper[4807]: I1127 11:12:40.639954 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8qqm" event={"ID":"16652a33-af22-4522-bd9c-8491bd6ae24f","Type":"ContainerDied","Data":"62f8d22f042b07bc62620047a44c6324f4ef1f42bd3098efe35cb00fe7c3228b"} Nov 27 11:12:40 crc kubenswrapper[4807]: I1127 11:12:40.665382 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:12:40 crc kubenswrapper[4807]: I1127 11:12:40.665444 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:12:40 crc kubenswrapper[4807]: I1127 11:12:40.691956 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-bl9gg" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" containerName="registry-server" probeResult="failure" output=< Nov 27 11:12:40 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Nov 27 11:12:40 crc kubenswrapper[4807]: > Nov 27 11:12:41 crc kubenswrapper[4807]: I1127 11:12:41.647834 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-f8qqm" event={"ID":"16652a33-af22-4522-bd9c-8491bd6ae24f","Type":"ContainerStarted","Data":"d4489a3629cd407f31cd28c9395f8a2974c5a840f5eef1641449d419bb82ed8c"} Nov 27 11:12:41 crc kubenswrapper[4807]: I1127 11:12:41.664268 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f8qqm" podStartSLOduration=2.331436013 podStartE2EDuration="54.664223167s" podCreationTimestamp="2025-11-27 11:11:47 +0000 UTC" firstStartedPulling="2025-11-27 11:11:49.02332209 +0000 UTC m=+150.122820288" lastFinishedPulling="2025-11-27 11:12:41.356109244 +0000 UTC m=+202.455607442" observedRunningTime="2025-11-27 11:12:41.663217428 +0000 UTC m=+202.762715636" watchObservedRunningTime="2025-11-27 11:12:41.664223167 +0000 UTC m=+202.763721365" Nov 27 11:12:41 crc kubenswrapper[4807]: I1127 11:12:41.705326 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vhk66" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" containerName="registry-server" probeResult="failure" output=< Nov 27 11:12:41 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Nov 27 11:12:41 crc kubenswrapper[4807]: > Nov 27 11:12:47 crc kubenswrapper[4807]: I1127 11:12:47.723009 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:12:47 crc kubenswrapper[4807]: I1127 11:12:47.767231 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:12:47 crc kubenswrapper[4807]: I1127 11:12:47.934477 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:12:47 crc kubenswrapper[4807]: I1127 11:12:47.934553 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:12:47 crc kubenswrapper[4807]: I1127 11:12:47.976918 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:12:48 crc kubenswrapper[4807]: I1127 11:12:48.174811 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:12:48 crc kubenswrapper[4807]: I1127 11:12:48.174864 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:12:48 crc kubenswrapper[4807]: I1127 11:12:48.209596 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:12:48 crc kubenswrapper[4807]: I1127 11:12:48.728872 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:12:48 crc kubenswrapper[4807]: I1127 11:12:48.750714 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:12:48 crc kubenswrapper[4807]: I1127 11:12:48.952427 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmwd8"] Nov 27 11:12:49 crc kubenswrapper[4807]: I1127 11:12:49.691101 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tmwd8" podUID="6fac2baf-01de-4e80-8434-4488846fd7fb" containerName="registry-server" containerID="cri-o://a0516c1f87e57946e7056492ee61d814a58a53eadbc1132bb25e12e35f66bbc7" gracePeriod=2 Nov 27 11:12:49 crc kubenswrapper[4807]: I1127 11:12:49.692554 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:12:49 crc kubenswrapper[4807]: I1127 11:12:49.731915 4807 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.119456 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.353098 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ntrpq"] Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.704039 4807 generic.go:334] "Generic (PLEG): container finished" podID="6fac2baf-01de-4e80-8434-4488846fd7fb" containerID="a0516c1f87e57946e7056492ee61d814a58a53eadbc1132bb25e12e35f66bbc7" exitCode=0 Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.704371 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ntrpq" podUID="14518714-cc42-4323-a3c9-307047368353" containerName="registry-server" containerID="cri-o://860b9eee03a484f38d0d4e4b13dfa6f582f18a5225432be87816010d87e477b6" gracePeriod=2 Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.705755 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmwd8" event={"ID":"6fac2baf-01de-4e80-8434-4488846fd7fb","Type":"ContainerDied","Data":"a0516c1f87e57946e7056492ee61d814a58a53eadbc1132bb25e12e35f66bbc7"} Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.735164 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.804889 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.895413 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.922094 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.922141 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.922180 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.922643 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bd76c06730caf399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.922693 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://bd76c06730caf399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e" gracePeriod=600 Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.993888 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-5qxg4\" (UniqueName: \"kubernetes.io/projected/6fac2baf-01de-4e80-8434-4488846fd7fb-kube-api-access-5qxg4\") pod \"6fac2baf-01de-4e80-8434-4488846fd7fb\" (UID: \"6fac2baf-01de-4e80-8434-4488846fd7fb\") " Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.994066 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fac2baf-01de-4e80-8434-4488846fd7fb-utilities\") pod \"6fac2baf-01de-4e80-8434-4488846fd7fb\" (UID: \"6fac2baf-01de-4e80-8434-4488846fd7fb\") " Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.994099 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fac2baf-01de-4e80-8434-4488846fd7fb-catalog-content\") pod \"6fac2baf-01de-4e80-8434-4488846fd7fb\" (UID: \"6fac2baf-01de-4e80-8434-4488846fd7fb\") " Nov 27 11:12:50 crc kubenswrapper[4807]: I1127 11:12:50.995088 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fac2baf-01de-4e80-8434-4488846fd7fb-utilities" (OuterVolumeSpecName: "utilities") pod "6fac2baf-01de-4e80-8434-4488846fd7fb" (UID: "6fac2baf-01de-4e80-8434-4488846fd7fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.001991 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fac2baf-01de-4e80-8434-4488846fd7fb-kube-api-access-5qxg4" (OuterVolumeSpecName: "kube-api-access-5qxg4") pod "6fac2baf-01de-4e80-8434-4488846fd7fb" (UID: "6fac2baf-01de-4e80-8434-4488846fd7fb"). InnerVolumeSpecName "kube-api-access-5qxg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.063827 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fac2baf-01de-4e80-8434-4488846fd7fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fac2baf-01de-4e80-8434-4488846fd7fb" (UID: "6fac2baf-01de-4e80-8434-4488846fd7fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.095146 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qxg4\" (UniqueName: \"kubernetes.io/projected/6fac2baf-01de-4e80-8434-4488846fd7fb-kube-api-access-5qxg4\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.095183 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fac2baf-01de-4e80-8434-4488846fd7fb-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.095193 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fac2baf-01de-4e80-8434-4488846fd7fb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.101240 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.195616 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjt52\" (UniqueName: \"kubernetes.io/projected/14518714-cc42-4323-a3c9-307047368353-kube-api-access-zjt52\") pod \"14518714-cc42-4323-a3c9-307047368353\" (UID: \"14518714-cc42-4323-a3c9-307047368353\") " Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.195708 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14518714-cc42-4323-a3c9-307047368353-catalog-content\") pod \"14518714-cc42-4323-a3c9-307047368353\" (UID: \"14518714-cc42-4323-a3c9-307047368353\") " Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.195744 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14518714-cc42-4323-a3c9-307047368353-utilities\") pod \"14518714-cc42-4323-a3c9-307047368353\" (UID: \"14518714-cc42-4323-a3c9-307047368353\") " Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.198611 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14518714-cc42-4323-a3c9-307047368353-utilities" (OuterVolumeSpecName: "utilities") pod "14518714-cc42-4323-a3c9-307047368353" (UID: "14518714-cc42-4323-a3c9-307047368353"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.198693 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14518714-cc42-4323-a3c9-307047368353-kube-api-access-zjt52" (OuterVolumeSpecName: "kube-api-access-zjt52") pod "14518714-cc42-4323-a3c9-307047368353" (UID: "14518714-cc42-4323-a3c9-307047368353"). InnerVolumeSpecName "kube-api-access-zjt52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.254004 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14518714-cc42-4323-a3c9-307047368353-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14518714-cc42-4323-a3c9-307047368353" (UID: "14518714-cc42-4323-a3c9-307047368353"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.297748 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjt52\" (UniqueName: \"kubernetes.io/projected/14518714-cc42-4323-a3c9-307047368353-kube-api-access-zjt52\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.297794 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14518714-cc42-4323-a3c9-307047368353-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.297808 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14518714-cc42-4323-a3c9-307047368353-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.714492 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmwd8" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.714497 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmwd8" event={"ID":"6fac2baf-01de-4e80-8434-4488846fd7fb","Type":"ContainerDied","Data":"e6b164bd1c2af3b650cb30bf67ccb8c9ea8397e89b05d4e31c7c374f86b9fe36"} Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.715041 4807 scope.go:117] "RemoveContainer" containerID="a0516c1f87e57946e7056492ee61d814a58a53eadbc1132bb25e12e35f66bbc7" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.717189 4807 generic.go:334] "Generic (PLEG): container finished" podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerID="bd76c06730caf399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e" exitCode=0 Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.717280 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"bd76c06730caf399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e"} Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.717367 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"cc81507d52c9f1bfb0bb1ff2c6a207a6a959b377cb7154504d0530b5e35f12d5"} Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.720683 4807 generic.go:334] "Generic (PLEG): container finished" podID="14518714-cc42-4323-a3c9-307047368353" containerID="860b9eee03a484f38d0d4e4b13dfa6f582f18a5225432be87816010d87e477b6" exitCode=0 Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.721705 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ntrpq" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.721767 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntrpq" event={"ID":"14518714-cc42-4323-a3c9-307047368353","Type":"ContainerDied","Data":"860b9eee03a484f38d0d4e4b13dfa6f582f18a5225432be87816010d87e477b6"} Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.721814 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ntrpq" event={"ID":"14518714-cc42-4323-a3c9-307047368353","Type":"ContainerDied","Data":"a0099fd86f5e3dc5b03c6953de8910e1005ff39ee9c53f90967b470cb9f9e699"} Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.752119 4807 scope.go:117] "RemoveContainer" containerID="44be6e4d075507494d251dd3aedd5ec18a03be81700a42aa4734ee19419768a5" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.764592 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmwd8"] Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.770509 4807 scope.go:117] "RemoveContainer" containerID="c42149efad8976068d96e9365a4d6b4c271468a3eed7f62a906b56b46d98d908" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.771047 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tmwd8"] Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.775218 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ntrpq"] Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.778912 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ntrpq"] Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.784913 4807 scope.go:117] "RemoveContainer" containerID="860b9eee03a484f38d0d4e4b13dfa6f582f18a5225432be87816010d87e477b6" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 
11:12:51.802688 4807 scope.go:117] "RemoveContainer" containerID="7ee7c3df800cb8dce1c414b9c92df5b2ada5671387defa14cae7cc2ba49d780c" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.822732 4807 scope.go:117] "RemoveContainer" containerID="d65b7e049aca72d2e5ab254b49abf9c7700afa16e3c77a1e44106d4cc4f859c8" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.863883 4807 scope.go:117] "RemoveContainer" containerID="860b9eee03a484f38d0d4e4b13dfa6f582f18a5225432be87816010d87e477b6" Nov 27 11:12:51 crc kubenswrapper[4807]: E1127 11:12:51.864284 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"860b9eee03a484f38d0d4e4b13dfa6f582f18a5225432be87816010d87e477b6\": container with ID starting with 860b9eee03a484f38d0d4e4b13dfa6f582f18a5225432be87816010d87e477b6 not found: ID does not exist" containerID="860b9eee03a484f38d0d4e4b13dfa6f582f18a5225432be87816010d87e477b6" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.864364 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"860b9eee03a484f38d0d4e4b13dfa6f582f18a5225432be87816010d87e477b6"} err="failed to get container status \"860b9eee03a484f38d0d4e4b13dfa6f582f18a5225432be87816010d87e477b6\": rpc error: code = NotFound desc = could not find container \"860b9eee03a484f38d0d4e4b13dfa6f582f18a5225432be87816010d87e477b6\": container with ID starting with 860b9eee03a484f38d0d4e4b13dfa6f582f18a5225432be87816010d87e477b6 not found: ID does not exist" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.864404 4807 scope.go:117] "RemoveContainer" containerID="7ee7c3df800cb8dce1c414b9c92df5b2ada5671387defa14cae7cc2ba49d780c" Nov 27 11:12:51 crc kubenswrapper[4807]: E1127 11:12:51.864875 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ee7c3df800cb8dce1c414b9c92df5b2ada5671387defa14cae7cc2ba49d780c\": container 
with ID starting with 7ee7c3df800cb8dce1c414b9c92df5b2ada5671387defa14cae7cc2ba49d780c not found: ID does not exist" containerID="7ee7c3df800cb8dce1c414b9c92df5b2ada5671387defa14cae7cc2ba49d780c" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.864905 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ee7c3df800cb8dce1c414b9c92df5b2ada5671387defa14cae7cc2ba49d780c"} err="failed to get container status \"7ee7c3df800cb8dce1c414b9c92df5b2ada5671387defa14cae7cc2ba49d780c\": rpc error: code = NotFound desc = could not find container \"7ee7c3df800cb8dce1c414b9c92df5b2ada5671387defa14cae7cc2ba49d780c\": container with ID starting with 7ee7c3df800cb8dce1c414b9c92df5b2ada5671387defa14cae7cc2ba49d780c not found: ID does not exist" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.864924 4807 scope.go:117] "RemoveContainer" containerID="d65b7e049aca72d2e5ab254b49abf9c7700afa16e3c77a1e44106d4cc4f859c8" Nov 27 11:12:51 crc kubenswrapper[4807]: E1127 11:12:51.865219 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d65b7e049aca72d2e5ab254b49abf9c7700afa16e3c77a1e44106d4cc4f859c8\": container with ID starting with d65b7e049aca72d2e5ab254b49abf9c7700afa16e3c77a1e44106d4cc4f859c8 not found: ID does not exist" containerID="d65b7e049aca72d2e5ab254b49abf9c7700afa16e3c77a1e44106d4cc4f859c8" Nov 27 11:12:51 crc kubenswrapper[4807]: I1127 11:12:51.865278 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d65b7e049aca72d2e5ab254b49abf9c7700afa16e3c77a1e44106d4cc4f859c8"} err="failed to get container status \"d65b7e049aca72d2e5ab254b49abf9c7700afa16e3c77a1e44106d4cc4f859c8\": rpc error: code = NotFound desc = could not find container \"d65b7e049aca72d2e5ab254b49abf9c7700afa16e3c77a1e44106d4cc4f859c8\": container with ID starting with d65b7e049aca72d2e5ab254b49abf9c7700afa16e3c77a1e44106d4cc4f859c8 not 
found: ID does not exist" Nov 27 11:12:52 crc kubenswrapper[4807]: I1127 11:12:52.750090 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9x6p"] Nov 27 11:12:52 crc kubenswrapper[4807]: I1127 11:12:52.750836 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t9x6p" podUID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" containerName="registry-server" containerID="cri-o://593696e870afdf875df7268c71b41837621c5b011f473cf4c1df325bdf414e66" gracePeriod=2 Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.190694 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.323831 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-utilities\") pod \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\" (UID: \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\") " Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.323954 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdlqt\" (UniqueName: \"kubernetes.io/projected/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-kube-api-access-gdlqt\") pod \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\" (UID: \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\") " Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.324033 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-catalog-content\") pod \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\" (UID: \"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52\") " Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.324558 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-utilities" (OuterVolumeSpecName: "utilities") pod "1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" (UID: "1df798f7-f40a-41b8-b4fb-1d6cf4e05f52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.325873 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.329529 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-kube-api-access-gdlqt" (OuterVolumeSpecName: "kube-api-access-gdlqt") pod "1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" (UID: "1df798f7-f40a-41b8-b4fb-1d6cf4e05f52"). InnerVolumeSpecName "kube-api-access-gdlqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.343127 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" (UID: "1df798f7-f40a-41b8-b4fb-1d6cf4e05f52"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.427329 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdlqt\" (UniqueName: \"kubernetes.io/projected/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-kube-api-access-gdlqt\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.427366 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.538617 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14518714-cc42-4323-a3c9-307047368353" path="/var/lib/kubelet/pods/14518714-cc42-4323-a3c9-307047368353/volumes" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.539869 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fac2baf-01de-4e80-8434-4488846fd7fb" path="/var/lib/kubelet/pods/6fac2baf-01de-4e80-8434-4488846fd7fb/volumes" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.606049 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" podUID="7462250b-699f-4fff-9600-8dff49efc2e8" containerName="oauth-openshift" containerID="cri-o://ef36d13432a2b1dcb6d2cbce900931c0cfdef30f63cadd500d7c9d45f95feabc" gracePeriod=15 Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.736313 4807 generic.go:334] "Generic (PLEG): container finished" podID="7462250b-699f-4fff-9600-8dff49efc2e8" containerID="ef36d13432a2b1dcb6d2cbce900931c0cfdef30f63cadd500d7c9d45f95feabc" exitCode=0 Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.736396 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" 
event={"ID":"7462250b-699f-4fff-9600-8dff49efc2e8","Type":"ContainerDied","Data":"ef36d13432a2b1dcb6d2cbce900931c0cfdef30f63cadd500d7c9d45f95feabc"} Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.746009 4807 generic.go:334] "Generic (PLEG): container finished" podID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" containerID="593696e870afdf875df7268c71b41837621c5b011f473cf4c1df325bdf414e66" exitCode=0 Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.746054 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9x6p" event={"ID":"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52","Type":"ContainerDied","Data":"593696e870afdf875df7268c71b41837621c5b011f473cf4c1df325bdf414e66"} Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.746093 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t9x6p" event={"ID":"1df798f7-f40a-41b8-b4fb-1d6cf4e05f52","Type":"ContainerDied","Data":"a0e98ef7a955da5902830d51bb1626c46e361fd93af849d6e70ad6ab1a482e0a"} Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.746117 4807 scope.go:117] "RemoveContainer" containerID="593696e870afdf875df7268c71b41837621c5b011f473cf4c1df325bdf414e66" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.746165 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t9x6p" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.763983 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9x6p"] Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.767940 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t9x6p"] Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.773807 4807 scope.go:117] "RemoveContainer" containerID="e105fd3cf09285c67abb93cb069639153bd350bf2d3368168b2c2fea3f1e1ec9" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.796204 4807 scope.go:117] "RemoveContainer" containerID="a968c44c5a31646c4c93cbba3d53241788c4ba62745b81fe4ed6c0aafde051f7" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.815205 4807 scope.go:117] "RemoveContainer" containerID="593696e870afdf875df7268c71b41837621c5b011f473cf4c1df325bdf414e66" Nov 27 11:12:53 crc kubenswrapper[4807]: E1127 11:12:53.815880 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"593696e870afdf875df7268c71b41837621c5b011f473cf4c1df325bdf414e66\": container with ID starting with 593696e870afdf875df7268c71b41837621c5b011f473cf4c1df325bdf414e66 not found: ID does not exist" containerID="593696e870afdf875df7268c71b41837621c5b011f473cf4c1df325bdf414e66" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.815920 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"593696e870afdf875df7268c71b41837621c5b011f473cf4c1df325bdf414e66"} err="failed to get container status \"593696e870afdf875df7268c71b41837621c5b011f473cf4c1df325bdf414e66\": rpc error: code = NotFound desc = could not find container \"593696e870afdf875df7268c71b41837621c5b011f473cf4c1df325bdf414e66\": container with ID starting with 593696e870afdf875df7268c71b41837621c5b011f473cf4c1df325bdf414e66 not found: 
ID does not exist" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.815947 4807 scope.go:117] "RemoveContainer" containerID="e105fd3cf09285c67abb93cb069639153bd350bf2d3368168b2c2fea3f1e1ec9" Nov 27 11:12:53 crc kubenswrapper[4807]: E1127 11:12:53.816391 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e105fd3cf09285c67abb93cb069639153bd350bf2d3368168b2c2fea3f1e1ec9\": container with ID starting with e105fd3cf09285c67abb93cb069639153bd350bf2d3368168b2c2fea3f1e1ec9 not found: ID does not exist" containerID="e105fd3cf09285c67abb93cb069639153bd350bf2d3368168b2c2fea3f1e1ec9" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.816422 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e105fd3cf09285c67abb93cb069639153bd350bf2d3368168b2c2fea3f1e1ec9"} err="failed to get container status \"e105fd3cf09285c67abb93cb069639153bd350bf2d3368168b2c2fea3f1e1ec9\": rpc error: code = NotFound desc = could not find container \"e105fd3cf09285c67abb93cb069639153bd350bf2d3368168b2c2fea3f1e1ec9\": container with ID starting with e105fd3cf09285c67abb93cb069639153bd350bf2d3368168b2c2fea3f1e1ec9 not found: ID does not exist" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.816471 4807 scope.go:117] "RemoveContainer" containerID="a968c44c5a31646c4c93cbba3d53241788c4ba62745b81fe4ed6c0aafde051f7" Nov 27 11:12:53 crc kubenswrapper[4807]: E1127 11:12:53.816980 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a968c44c5a31646c4c93cbba3d53241788c4ba62745b81fe4ed6c0aafde051f7\": container with ID starting with a968c44c5a31646c4c93cbba3d53241788c4ba62745b81fe4ed6c0aafde051f7 not found: ID does not exist" containerID="a968c44c5a31646c4c93cbba3d53241788c4ba62745b81fe4ed6c0aafde051f7" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.817024 4807 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a968c44c5a31646c4c93cbba3d53241788c4ba62745b81fe4ed6c0aafde051f7"} err="failed to get container status \"a968c44c5a31646c4c93cbba3d53241788c4ba62745b81fe4ed6c0aafde051f7\": rpc error: code = NotFound desc = could not find container \"a968c44c5a31646c4c93cbba3d53241788c4ba62745b81fe4ed6c0aafde051f7\": container with ID starting with a968c44c5a31646c4c93cbba3d53241788c4ba62745b81fe4ed6c0aafde051f7 not found: ID does not exist" Nov 27 11:12:53 crc kubenswrapper[4807]: I1127 11:12:53.998976 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.135236 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-trusted-ca-bundle\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.135326 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7462250b-699f-4fff-9600-8dff49efc2e8-audit-dir\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.135375 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-session\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.135395 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-provider-selection\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.135444 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-service-ca\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.135478 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-idp-0-file-data\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.135533 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-login\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.135567 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4c84\" (UniqueName: \"kubernetes.io/projected/7462250b-699f-4fff-9600-8dff49efc2e8-kube-api-access-z4c84\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.135649 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-error\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.135465 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7462250b-699f-4fff-9600-8dff49efc2e8-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.135963 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.136088 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.136276 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-router-certs\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.136370 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-audit-policies\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.136417 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-serving-cert\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.136459 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-ocp-branding-template\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.136480 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-cliconfig\") pod \"7462250b-699f-4fff-9600-8dff49efc2e8\" (UID: \"7462250b-699f-4fff-9600-8dff49efc2e8\") " Nov 27 11:12:54 crc 
kubenswrapper[4807]: I1127 11:12:54.137014 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.137035 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.137045 4807 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7462250b-699f-4fff-9600-8dff49efc2e8-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.137129 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.137240 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.154153 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7462250b-699f-4fff-9600-8dff49efc2e8-kube-api-access-z4c84" (OuterVolumeSpecName: "kube-api-access-z4c84") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "kube-api-access-z4c84". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.153838 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.164432 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.164667 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.165821 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.166136 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.166454 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.166652 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.168678 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "7462250b-699f-4fff-9600-8dff49efc2e8" (UID: "7462250b-699f-4fff-9600-8dff49efc2e8"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.238631 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.238666 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.238679 4807 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.238691 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.238701 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-ocp-branding-template\") on node 
\"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.238712 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.238720 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.238730 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.238740 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.238748 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/7462250b-699f-4fff-9600-8dff49efc2e8-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.238756 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4c84\" (UniqueName: \"kubernetes.io/projected/7462250b-699f-4fff-9600-8dff49efc2e8-kube-api-access-z4c84\") on node \"crc\" DevicePath \"\"" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.754226 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" 
event={"ID":"7462250b-699f-4fff-9600-8dff49efc2e8","Type":"ContainerDied","Data":"c132e74c9ecf1bcfe0c0fb26a4ab006461b2e984089a0ba2a3d919581a21ba32"} Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.754304 4807 scope.go:117] "RemoveContainer" containerID="ef36d13432a2b1dcb6d2cbce900931c0cfdef30f63cadd500d7c9d45f95feabc" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.754382 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-qsdql" Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.777557 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qsdql"] Nov 27 11:12:54 crc kubenswrapper[4807]: I1127 11:12:54.780628 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-qsdql"] Nov 27 11:12:55 crc kubenswrapper[4807]: I1127 11:12:55.547140 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" path="/var/lib/kubelet/pods/1df798f7-f40a-41b8-b4fb-1d6cf4e05f52/volumes" Nov 27 11:12:55 crc kubenswrapper[4807]: I1127 11:12:55.548920 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7462250b-699f-4fff-9600-8dff49efc2e8" path="/var/lib/kubelet/pods/7462250b-699f-4fff-9600-8dff49efc2e8/volumes" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.088858 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk"] Nov 27 11:12:59 crc kubenswrapper[4807]: E1127 11:12:59.089528 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" containerName="extract-content" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.089844 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" containerName="extract-content" Nov 27 11:12:59 crc 
kubenswrapper[4807]: E1127 11:12:59.089877 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" containerName="extract-utilities" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.089890 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" containerName="extract-utilities" Nov 27 11:12:59 crc kubenswrapper[4807]: E1127 11:12:59.089906 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" containerName="registry-server" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.089920 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" containerName="registry-server" Nov 27 11:12:59 crc kubenswrapper[4807]: E1127 11:12:59.089938 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14518714-cc42-4323-a3c9-307047368353" containerName="extract-content" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.089950 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="14518714-cc42-4323-a3c9-307047368353" containerName="extract-content" Nov 27 11:12:59 crc kubenswrapper[4807]: E1127 11:12:59.089967 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fac2baf-01de-4e80-8434-4488846fd7fb" containerName="extract-utilities" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.089979 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fac2baf-01de-4e80-8434-4488846fd7fb" containerName="extract-utilities" Nov 27 11:12:59 crc kubenswrapper[4807]: E1127 11:12:59.089996 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fac2baf-01de-4e80-8434-4488846fd7fb" containerName="extract-content" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.090008 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fac2baf-01de-4e80-8434-4488846fd7fb" containerName="extract-content" Nov 27 11:12:59 crc 
kubenswrapper[4807]: E1127 11:12:59.090026 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14518714-cc42-4323-a3c9-307047368353" containerName="registry-server" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.090037 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="14518714-cc42-4323-a3c9-307047368353" containerName="registry-server" Nov 27 11:12:59 crc kubenswrapper[4807]: E1127 11:12:59.090061 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7462250b-699f-4fff-9600-8dff49efc2e8" containerName="oauth-openshift" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.090074 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7462250b-699f-4fff-9600-8dff49efc2e8" containerName="oauth-openshift" Nov 27 11:12:59 crc kubenswrapper[4807]: E1127 11:12:59.090091 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fac2baf-01de-4e80-8434-4488846fd7fb" containerName="registry-server" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.090104 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fac2baf-01de-4e80-8434-4488846fd7fb" containerName="registry-server" Nov 27 11:12:59 crc kubenswrapper[4807]: E1127 11:12:59.090121 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14518714-cc42-4323-a3c9-307047368353" containerName="extract-utilities" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.090132 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="14518714-cc42-4323-a3c9-307047368353" containerName="extract-utilities" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.090329 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="14518714-cc42-4323-a3c9-307047368353" containerName="registry-server" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.090355 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7462250b-699f-4fff-9600-8dff49efc2e8" containerName="oauth-openshift" Nov 27 11:12:59 crc 
kubenswrapper[4807]: I1127 11:12:59.090372 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fac2baf-01de-4e80-8434-4488846fd7fb" containerName="registry-server" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.090385 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df798f7-f40a-41b8-b4fb-1d6cf4e05f52" containerName="registry-server" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.090974 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.092843 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.103684 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.104050 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.104066 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.104210 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.104782 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.105155 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.105466 4807 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.106124 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.107610 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.107604 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.107866 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.112552 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.116206 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk"] Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.118285 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.125858 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209085 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-service-ca\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: 
\"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209158 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-audit-policies\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209191 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-error\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209238 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209296 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk7bf\" (UniqueName: \"kubernetes.io/projected/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-kube-api-access-lk7bf\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209317 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-router-certs\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209365 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209395 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-login\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209447 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209469 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-audit-dir\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209488 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209539 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-session\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209572 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.209631 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" 
(UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310587 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310641 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310694 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-service-ca\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310715 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-audit-policies\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310734 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-error\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310751 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310772 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-router-certs\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310793 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7bf\" (UniqueName: \"kubernetes.io/projected/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-kube-api-access-lk7bf\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310817 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: 
\"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310843 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-login\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310860 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310876 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-audit-dir\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310892 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.310908 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-session\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.311600 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-audit-policies\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.311670 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-service-ca\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.311851 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-audit-dir\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.312502 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 
11:12:59.312842 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.320161 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.320782 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.320849 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.322484 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.323713 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-error\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.323917 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-login\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.325547 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-router-certs\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.328840 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-session\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 
11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.335032 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk7bf\" (UniqueName: \"kubernetes.io/projected/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-kube-api-access-lk7bf\") pod \"oauth-openshift-6c8d5d4f46-t6sqk\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.418223 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:12:59 crc kubenswrapper[4807]: I1127 11:12:59.830797 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk"] Nov 27 11:12:59 crc kubenswrapper[4807]: W1127 11:12:59.841531 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d6f0746_0c61_45ef_94d0_f3d4bb789f1f.slice/crio-207fae96195cf5d8aad19742abd0920d7438967050e106ea398030953d46dea1 WatchSource:0}: Error finding container 207fae96195cf5d8aad19742abd0920d7438967050e106ea398030953d46dea1: Status 404 returned error can't find the container with id 207fae96195cf5d8aad19742abd0920d7438967050e106ea398030953d46dea1 Nov 27 11:13:00 crc kubenswrapper[4807]: I1127 11:13:00.807884 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" event={"ID":"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f","Type":"ContainerStarted","Data":"d41d8759ec666e46c9dccf04fa78c6be89bfe268b1780229d9624921841f99ea"} Nov 27 11:13:00 crc kubenswrapper[4807]: I1127 11:13:00.807935 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" 
event={"ID":"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f","Type":"ContainerStarted","Data":"207fae96195cf5d8aad19742abd0920d7438967050e106ea398030953d46dea1"} Nov 27 11:13:00 crc kubenswrapper[4807]: I1127 11:13:00.808116 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:13:00 crc kubenswrapper[4807]: I1127 11:13:00.816832 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:13:00 crc kubenswrapper[4807]: I1127 11:13:00.827669 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" podStartSLOduration=32.827638502 podStartE2EDuration="32.827638502s" podCreationTimestamp="2025-11-27 11:12:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:13:00.823408939 +0000 UTC m=+221.922907147" watchObservedRunningTime="2025-11-27 11:13:00.827638502 +0000 UTC m=+221.927136740" Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.721358 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f8qqm"] Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.723165 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f8qqm" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" containerName="registry-server" containerID="cri-o://d4489a3629cd407f31cd28c9395f8a2974c5a840f5eef1641449d419bb82ed8c" gracePeriod=30 Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.739681 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4zn8d"] Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.740121 4807 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-4zn8d" podUID="7984b376-029e-465e-893e-f62f047ee418" containerName="registry-server" containerID="cri-o://43436aee429a3e83d476f833f00c7d06382a1694db5d0a1b763cda166725e3d3" gracePeriod=30 Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.755393 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-82ntv"] Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.756577 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" containerName="marketplace-operator" containerID="cri-o://6617f5552a5be9cb1d47e1efef730cfc812a511591b28d13d051ea011529efd3" gracePeriod=30 Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.771139 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bl9gg"] Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.771441 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bl9gg" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" containerName="registry-server" containerID="cri-o://9ac12b95244684cc27613f0aeeb27eb96502f2c4428f581834fc5f68c4b9eda8" gracePeriod=30 Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.779459 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vhk66"] Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.779703 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vhk66" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" containerName="registry-server" containerID="cri-o://a4aed7eaa6d4fd1e387daaa33fca50dbc023590adaac39e0638bf473aad02afe" gracePeriod=30 Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.785092 4807 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q6pz5"] Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.786016 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.787899 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q6pz5"] Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.904139 4807 generic.go:334] "Generic (PLEG): container finished" podID="65e5ef23-71ad-40ae-81bb-e94d9d298087" containerID="6617f5552a5be9cb1d47e1efef730cfc812a511591b28d13d051ea011529efd3" exitCode=0 Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.904427 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" event={"ID":"65e5ef23-71ad-40ae-81bb-e94d9d298087","Type":"ContainerDied","Data":"6617f5552a5be9cb1d47e1efef730cfc812a511591b28d13d051ea011529efd3"} Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.908105 4807 generic.go:334] "Generic (PLEG): container finished" podID="7984b376-029e-465e-893e-f62f047ee418" containerID="43436aee429a3e83d476f833f00c7d06382a1694db5d0a1b763cda166725e3d3" exitCode=0 Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.908215 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zn8d" event={"ID":"7984b376-029e-465e-893e-f62f047ee418","Type":"ContainerDied","Data":"43436aee429a3e83d476f833f00c7d06382a1694db5d0a1b763cda166725e3d3"} Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.909791 4807 generic.go:334] "Generic (PLEG): container finished" podID="16652a33-af22-4522-bd9c-8491bd6ae24f" containerID="d4489a3629cd407f31cd28c9395f8a2974c5a840f5eef1641449d419bb82ed8c" exitCode=0 Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.909865 4807 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-f8qqm" event={"ID":"16652a33-af22-4522-bd9c-8491bd6ae24f","Type":"ContainerDied","Data":"d4489a3629cd407f31cd28c9395f8a2974c5a840f5eef1641449d419bb82ed8c"} Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.913300 4807 generic.go:334] "Generic (PLEG): container finished" podID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" containerID="9ac12b95244684cc27613f0aeeb27eb96502f2c4428f581834fc5f68c4b9eda8" exitCode=0 Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.913324 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9gg" event={"ID":"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5","Type":"ContainerDied","Data":"9ac12b95244684cc27613f0aeeb27eb96502f2c4428f581834fc5f68c4b9eda8"} Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.928080 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5wqb\" (UniqueName: \"kubernetes.io/projected/c5121ac2-4e63-4d46-b899-89bbfbb19550-kube-api-access-q5wqb\") pod \"marketplace-operator-79b997595-q6pz5\" (UID: \"c5121ac2-4e63-4d46-b899-89bbfbb19550\") " pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.928188 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5121ac2-4e63-4d46-b899-89bbfbb19550-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q6pz5\" (UID: \"c5121ac2-4e63-4d46-b899-89bbfbb19550\") " pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:15 crc kubenswrapper[4807]: I1127 11:13:15.928233 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c5121ac2-4e63-4d46-b899-89bbfbb19550-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-q6pz5\" (UID: \"c5121ac2-4e63-4d46-b899-89bbfbb19550\") " pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.029027 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c5121ac2-4e63-4d46-b899-89bbfbb19550-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q6pz5\" (UID: \"c5121ac2-4e63-4d46-b899-89bbfbb19550\") " pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.029087 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5wqb\" (UniqueName: \"kubernetes.io/projected/c5121ac2-4e63-4d46-b899-89bbfbb19550-kube-api-access-q5wqb\") pod \"marketplace-operator-79b997595-q6pz5\" (UID: \"c5121ac2-4e63-4d46-b899-89bbfbb19550\") " pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.029164 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5121ac2-4e63-4d46-b899-89bbfbb19550-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q6pz5\" (UID: \"c5121ac2-4e63-4d46-b899-89bbfbb19550\") " pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.030492 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5121ac2-4e63-4d46-b899-89bbfbb19550-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-q6pz5\" (UID: \"c5121ac2-4e63-4d46-b899-89bbfbb19550\") " pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.036285 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/c5121ac2-4e63-4d46-b899-89bbfbb19550-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-q6pz5\" (UID: \"c5121ac2-4e63-4d46-b899-89bbfbb19550\") " pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.044939 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5wqb\" (UniqueName: \"kubernetes.io/projected/c5121ac2-4e63-4d46-b899-89bbfbb19550-kube-api-access-q5wqb\") pod \"marketplace-operator-79b997595-q6pz5\" (UID: \"c5121ac2-4e63-4d46-b899-89bbfbb19550\") " pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.194317 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.198163 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.214961 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.215524 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.222937 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.240642 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335179 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktp87\" (UniqueName: \"kubernetes.io/projected/65e5ef23-71ad-40ae-81bb-e94d9d298087-kube-api-access-ktp87\") pod \"65e5ef23-71ad-40ae-81bb-e94d9d298087\" (UID: \"65e5ef23-71ad-40ae-81bb-e94d9d298087\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335236 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7984b376-029e-465e-893e-f62f047ee418-catalog-content\") pod \"7984b376-029e-465e-893e-f62f047ee418\" (UID: \"7984b376-029e-465e-893e-f62f047ee418\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335295 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hkt6\" (UniqueName: \"kubernetes.io/projected/c61dff77-8482-4e06-b99e-72c1cd18c4ca-kube-api-access-9hkt6\") pod \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\" (UID: \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335320 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg42b\" (UniqueName: \"kubernetes.io/projected/7984b376-029e-465e-893e-f62f047ee418-kube-api-access-jg42b\") pod \"7984b376-029e-465e-893e-f62f047ee418\" (UID: \"7984b376-029e-465e-893e-f62f047ee418\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335360 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-utilities\") pod \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\" (UID: \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335377 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jzlmq\" (UniqueName: \"kubernetes.io/projected/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-kube-api-access-jzlmq\") pod \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\" (UID: \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335404 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c61dff77-8482-4e06-b99e-72c1cd18c4ca-catalog-content\") pod \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\" (UID: \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335422 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65e5ef23-71ad-40ae-81bb-e94d9d298087-marketplace-trusted-ca\") pod \"65e5ef23-71ad-40ae-81bb-e94d9d298087\" (UID: \"65e5ef23-71ad-40ae-81bb-e94d9d298087\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335446 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-catalog-content\") pod \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\" (UID: \"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335465 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj497\" (UniqueName: \"kubernetes.io/projected/16652a33-af22-4522-bd9c-8491bd6ae24f-kube-api-access-tj497\") pod \"16652a33-af22-4522-bd9c-8491bd6ae24f\" (UID: \"16652a33-af22-4522-bd9c-8491bd6ae24f\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335484 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7984b376-029e-465e-893e-f62f047ee418-utilities\") pod 
\"7984b376-029e-465e-893e-f62f047ee418\" (UID: \"7984b376-029e-465e-893e-f62f047ee418\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335500 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/65e5ef23-71ad-40ae-81bb-e94d9d298087-marketplace-operator-metrics\") pod \"65e5ef23-71ad-40ae-81bb-e94d9d298087\" (UID: \"65e5ef23-71ad-40ae-81bb-e94d9d298087\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335518 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16652a33-af22-4522-bd9c-8491bd6ae24f-utilities\") pod \"16652a33-af22-4522-bd9c-8491bd6ae24f\" (UID: \"16652a33-af22-4522-bd9c-8491bd6ae24f\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335539 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c61dff77-8482-4e06-b99e-72c1cd18c4ca-utilities\") pod \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\" (UID: \"c61dff77-8482-4e06-b99e-72c1cd18c4ca\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.335558 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16652a33-af22-4522-bd9c-8491bd6ae24f-catalog-content\") pod \"16652a33-af22-4522-bd9c-8491bd6ae24f\" (UID: \"16652a33-af22-4522-bd9c-8491bd6ae24f\") " Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.338527 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16652a33-af22-4522-bd9c-8491bd6ae24f-kube-api-access-tj497" (OuterVolumeSpecName: "kube-api-access-tj497") pod "16652a33-af22-4522-bd9c-8491bd6ae24f" (UID: "16652a33-af22-4522-bd9c-8491bd6ae24f"). InnerVolumeSpecName "kube-api-access-tj497". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.338769 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e5ef23-71ad-40ae-81bb-e94d9d298087-kube-api-access-ktp87" (OuterVolumeSpecName: "kube-api-access-ktp87") pod "65e5ef23-71ad-40ae-81bb-e94d9d298087" (UID: "65e5ef23-71ad-40ae-81bb-e94d9d298087"). InnerVolumeSpecName "kube-api-access-ktp87". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.339284 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-kube-api-access-jzlmq" (OuterVolumeSpecName: "kube-api-access-jzlmq") pod "4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" (UID: "4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5"). InnerVolumeSpecName "kube-api-access-jzlmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.339878 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16652a33-af22-4522-bd9c-8491bd6ae24f-utilities" (OuterVolumeSpecName: "utilities") pod "16652a33-af22-4522-bd9c-8491bd6ae24f" (UID: "16652a33-af22-4522-bd9c-8491bd6ae24f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.340226 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e5ef23-71ad-40ae-81bb-e94d9d298087-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "65e5ef23-71ad-40ae-81bb-e94d9d298087" (UID: "65e5ef23-71ad-40ae-81bb-e94d9d298087"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.340277 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-utilities" (OuterVolumeSpecName: "utilities") pod "4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" (UID: "4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.341655 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c61dff77-8482-4e06-b99e-72c1cd18c4ca-utilities" (OuterVolumeSpecName: "utilities") pod "c61dff77-8482-4e06-b99e-72c1cd18c4ca" (UID: "c61dff77-8482-4e06-b99e-72c1cd18c4ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.343133 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e5ef23-71ad-40ae-81bb-e94d9d298087-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "65e5ef23-71ad-40ae-81bb-e94d9d298087" (UID: "65e5ef23-71ad-40ae-81bb-e94d9d298087"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.348959 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7984b376-029e-465e-893e-f62f047ee418-utilities" (OuterVolumeSpecName: "utilities") pod "7984b376-029e-465e-893e-f62f047ee418" (UID: "7984b376-029e-465e-893e-f62f047ee418"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.356479 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7984b376-029e-465e-893e-f62f047ee418-kube-api-access-jg42b" (OuterVolumeSpecName: "kube-api-access-jg42b") pod "7984b376-029e-465e-893e-f62f047ee418" (UID: "7984b376-029e-465e-893e-f62f047ee418"). InnerVolumeSpecName "kube-api-access-jg42b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.356513 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c61dff77-8482-4e06-b99e-72c1cd18c4ca-kube-api-access-9hkt6" (OuterVolumeSpecName: "kube-api-access-9hkt6") pod "c61dff77-8482-4e06-b99e-72c1cd18c4ca" (UID: "c61dff77-8482-4e06-b99e-72c1cd18c4ca"). InnerVolumeSpecName "kube-api-access-9hkt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.370789 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" (UID: "4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.416786 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7984b376-029e-465e-893e-f62f047ee418-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7984b376-029e-465e-893e-f62f047ee418" (UID: "7984b376-029e-465e-893e-f62f047ee418"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.417944 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16652a33-af22-4522-bd9c-8491bd6ae24f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16652a33-af22-4522-bd9c-8491bd6ae24f" (UID: "16652a33-af22-4522-bd9c-8491bd6ae24f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.436880 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktp87\" (UniqueName: \"kubernetes.io/projected/65e5ef23-71ad-40ae-81bb-e94d9d298087-kube-api-access-ktp87\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.436924 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7984b376-029e-465e-893e-f62f047ee418-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.436938 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hkt6\" (UniqueName: \"kubernetes.io/projected/c61dff77-8482-4e06-b99e-72c1cd18c4ca-kube-api-access-9hkt6\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.436950 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg42b\" (UniqueName: \"kubernetes.io/projected/7984b376-029e-465e-893e-f62f047ee418-kube-api-access-jg42b\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.436962 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.436973 4807 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jzlmq\" (UniqueName: \"kubernetes.io/projected/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-kube-api-access-jzlmq\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.436985 4807 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65e5ef23-71ad-40ae-81bb-e94d9d298087-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.436996 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.437032 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj497\" (UniqueName: \"kubernetes.io/projected/16652a33-af22-4522-bd9c-8491bd6ae24f-kube-api-access-tj497\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.437042 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7984b376-029e-465e-893e-f62f047ee418-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.437056 4807 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/65e5ef23-71ad-40ae-81bb-e94d9d298087-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.437067 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16652a33-af22-4522-bd9c-8491bd6ae24f-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.437076 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c61dff77-8482-4e06-b99e-72c1cd18c4ca-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.437100 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16652a33-af22-4522-bd9c-8491bd6ae24f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.450761 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c61dff77-8482-4e06-b99e-72c1cd18c4ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c61dff77-8482-4e06-b99e-72c1cd18c4ca" (UID: "c61dff77-8482-4e06-b99e-72c1cd18c4ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.537834 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c61dff77-8482-4e06-b99e-72c1cd18c4ca-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.636359 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-q6pz5"] Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.847484 4807 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.847932 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" containerName="extract-utilities" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.847944 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" containerName="extract-utilities" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.847953 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" containerName="extract-content" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.847959 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" containerName="extract-content" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.847968 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" containerName="extract-utilities" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.847974 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" containerName="extract-utilities" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.847982 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" containerName="marketplace-operator" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.847989 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" containerName="marketplace-operator" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.848000 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" containerName="registry-server" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848005 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" containerName="registry-server" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.848014 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" containerName="registry-server" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848020 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" containerName="registry-server" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.848029 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" containerName="extract-content" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848034 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" containerName="extract-content" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.848042 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" containerName="extract-content" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848047 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" containerName="extract-content" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.848054 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" containerName="extract-utilities" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848060 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" containerName="extract-utilities" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.848069 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7984b376-029e-465e-893e-f62f047ee418" containerName="extract-utilities" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848225 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7984b376-029e-465e-893e-f62f047ee418" containerName="extract-utilities" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.848232 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7984b376-029e-465e-893e-f62f047ee418" containerName="registry-server" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848238 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7984b376-029e-465e-893e-f62f047ee418" containerName="registry-server" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.848260 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" containerName="registry-server" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848266 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" containerName="registry-server" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.848272 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7984b376-029e-465e-893e-f62f047ee418" containerName="extract-content" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848277 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7984b376-029e-465e-893e-f62f047ee418" containerName="extract-content" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848399 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7984b376-029e-465e-893e-f62f047ee418" containerName="registry-server" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848411 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" containerName="registry-server" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848421 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" containerName="registry-server" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848428 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" containerName="marketplace-operator" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848438 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" containerName="registry-server" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.848787 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.853950 4807 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.854300 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b" gracePeriod=15 Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.854337 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977" gracePeriod=15 Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.854361 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd" gracePeriod=15 Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.854356 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71" gracePeriod=15 Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.854321 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b" gracePeriod=15 Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.855678 4807 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.855822 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.855833 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.855844 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.855857 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.855865 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.855872 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.855879 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.855887 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.855900 4807 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.855909 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.855922 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.855930 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.855938 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.855944 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.856039 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.856049 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.856059 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.856067 4807 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.856076 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.856087 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.880548 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.920786 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zn8d" event={"ID":"7984b376-029e-465e-893e-f62f047ee418","Type":"ContainerDied","Data":"7c9daf59de1dde193c9dcba747459a10432e0ed84264b4c2f1c7ae043d4be4c7"} Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.920841 4807 scope.go:117] "RemoveContainer" containerID="43436aee429a3e83d476f833f00c7d06382a1694db5d0a1b763cda166725e3d3" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.920838 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4zn8d" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.921914 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.922443 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.922776 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.923024 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f8qqm" event={"ID":"16652a33-af22-4522-bd9c-8491bd6ae24f","Type":"ContainerDied","Data":"a984142023d554ae6adb3a3b84f913b88c5120ca2d4e3d28672ff1c57435ea1a"} Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.923118 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f8qqm" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.923637 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.923884 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.924165 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.924404 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.926532 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bl9gg" event={"ID":"4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5","Type":"ContainerDied","Data":"db1fa9aee0ab1fa3ef2464d0f517d57852faea54fc76c75135759423009fd564"} 
Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.926561 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bl9gg" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.926978 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.927520 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.927819 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.928053 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.928262 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" 
event={"ID":"c5121ac2-4e63-4d46-b899-89bbfbb19550","Type":"ContainerStarted","Data":"51f679782a1ca666345dca273b13ef5b1a7509fc505ee24d60369dbd85b3e00d"} Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.928287 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" event={"ID":"c5121ac2-4e63-4d46-b899-89bbfbb19550","Type":"ContainerStarted","Data":"37d55ca4e971ed2ae7b397b36011cf3e56332193c49cd12543999b6691a54946"} Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.928284 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.928632 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.928955 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.930228 4807 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-q6pz5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" start-of-body= Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.930371 4807 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.930272 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.930656 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: E1127 11:13:16.930618 4807 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event=< Nov 27 11:13:16 crc kubenswrapper[4807]: &Event{ObjectMeta:{marketplace-operator-79b997595-q6pz5.187bd8bfb4dc1db5 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:marketplace-operator-79b997595-q6pz5,UID:c5121ac2-4e63-4d46-b899-89bbfbb19550,APIVersion:v1,ResourceVersion:29447,FieldPath:spec.containers{marketplace-operator},},Reason:ProbeError,Message:Readiness probe error: Get "http://10.217.0.57:8080/healthz": dial tcp 10.217.0.57:8080: connect: connection refused Nov 27 11:13:16 crc kubenswrapper[4807]: body: Nov 
27 11:13:16 crc kubenswrapper[4807]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-27 11:13:16.930358709 +0000 UTC m=+238.029856907,LastTimestamp:2025-11-27 11:13:16.930358709 +0000 UTC m=+238.029856907,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Nov 27 11:13:16 crc kubenswrapper[4807]: > Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.930829 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.931354 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.931559 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.931688 4807 generic.go:334] "Generic (PLEG): container finished" podID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" containerID="a4aed7eaa6d4fd1e387daaa33fca50dbc023590adaac39e0638bf473aad02afe" exitCode=0 Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.931731 4807 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-vhk66" event={"ID":"c61dff77-8482-4e06-b99e-72c1cd18c4ca","Type":"ContainerDied","Data":"a4aed7eaa6d4fd1e387daaa33fca50dbc023590adaac39e0638bf473aad02afe"} Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.931751 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vhk66" event={"ID":"c61dff77-8482-4e06-b99e-72c1cd18c4ca","Type":"ContainerDied","Data":"d778404a16cf691295fa1c89bcf6f82b237523bf2078b067653d0b587a01a772"} Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.931816 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vhk66" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.932823 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.933094 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.933336 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.933691 4807 status_manager.go:851] 
"Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.933871 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" event={"ID":"65e5ef23-71ad-40ae-81bb-e94d9d298087","Type":"ContainerDied","Data":"b1faebbc18104f2082a1daaa24e48f248b3a7b7f5630a4ee275b60eda9baf4d1"} Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.933891 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.934536 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.934984 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.935840 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.936591 4807 scope.go:117] "RemoveContainer" containerID="71b5b4c6e1fdf96acfbcc0bec52eb31b3f0f5ef4c5d4db9eb5f845da789deb78" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.937521 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.938530 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.938761 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.938989 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc 
kubenswrapper[4807]: I1127 11:13:16.939184 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.939381 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.939574 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.939767 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.940239 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 
crc kubenswrapper[4807]: I1127 11:13:16.940464 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.940647 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.940999 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.941570 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.942765 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection 
refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.943139 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.943415 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.946752 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.948414 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.949189 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.949281 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.949579 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.949595 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.950204 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.950337 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.966111 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.966540 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.967287 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.967573 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.967833 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.969461 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.969661 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.969845 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:16 crc kubenswrapper[4807]: I1127 11:13:16.984333 4807 scope.go:117] "RemoveContainer" containerID="3bc07e9f509d87b31878c2444d977e7420f6191a07d14d7fccca97ee7abc67dc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.011205 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: 
I1127 11:13:17.012314 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.012919 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.013200 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.013464 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.013708 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: 
connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.013949 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.014220 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.014694 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.014985 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.015408 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: 
connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.015665 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.015998 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.016264 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.016542 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.017017 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 
38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.018964 4807 scope.go:117] "RemoveContainer" containerID="d4489a3629cd407f31cd28c9395f8a2974c5a840f5eef1641449d419bb82ed8c" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.031365 4807 scope.go:117] "RemoveContainer" containerID="62f8d22f042b07bc62620047a44c6324f4ef1f42bd3098efe35cb00fe7c3228b" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.051907 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.051945 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.051965 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.052013 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 
11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.052029 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.052047 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.052054 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.052081 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.052089 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: 
I1127 11:13:17.052108 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.052056 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.052126 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.052238 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.052348 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.052412 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.052427 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.169201 4807 scope.go:117] "RemoveContainer" containerID="06292c7cb6be8587c920949db58a1c438c570e7f5c43597a23e707700de7df08" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.174757 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.181015 4807 scope.go:117] "RemoveContainer" containerID="9ac12b95244684cc27613f0aeeb27eb96502f2c4428f581834fc5f68c4b9eda8" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.192804 4807 scope.go:117] "RemoveContainer" containerID="f0c7c74900eb6fedcc44429d2a98cdd1e27b56069ecb54031c294c2f5e4ec3c3" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.212075 4807 scope.go:117] "RemoveContainer" containerID="df88f9a6750fc797ea8b621d3e3fa8b3f79b8c37b54a21e9efdcd52eecfeb675" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.225369 4807 scope.go:117] "RemoveContainer" containerID="a4aed7eaa6d4fd1e387daaa33fca50dbc023590adaac39e0638bf473aad02afe" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.242113 4807 scope.go:117] "RemoveContainer" containerID="0af6172037604eb53e48d69b8e6f14d65daf61f55ce5f6d0cc190f2c140e70a9" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.259652 4807 scope.go:117] "RemoveContainer" 
containerID="1a2146acc723cc20833dd626417af8ebffe5ab90b10817b282b0d18945e21f6f" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.271459 4807 scope.go:117] "RemoveContainer" containerID="a4aed7eaa6d4fd1e387daaa33fca50dbc023590adaac39e0638bf473aad02afe" Nov 27 11:13:17 crc kubenswrapper[4807]: E1127 11:13:17.271823 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4aed7eaa6d4fd1e387daaa33fca50dbc023590adaac39e0638bf473aad02afe\": container with ID starting with a4aed7eaa6d4fd1e387daaa33fca50dbc023590adaac39e0638bf473aad02afe not found: ID does not exist" containerID="a4aed7eaa6d4fd1e387daaa33fca50dbc023590adaac39e0638bf473aad02afe" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.271867 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4aed7eaa6d4fd1e387daaa33fca50dbc023590adaac39e0638bf473aad02afe"} err="failed to get container status \"a4aed7eaa6d4fd1e387daaa33fca50dbc023590adaac39e0638bf473aad02afe\": rpc error: code = NotFound desc = could not find container \"a4aed7eaa6d4fd1e387daaa33fca50dbc023590adaac39e0638bf473aad02afe\": container with ID starting with a4aed7eaa6d4fd1e387daaa33fca50dbc023590adaac39e0638bf473aad02afe not found: ID does not exist" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.271919 4807 scope.go:117] "RemoveContainer" containerID="0af6172037604eb53e48d69b8e6f14d65daf61f55ce5f6d0cc190f2c140e70a9" Nov 27 11:13:17 crc kubenswrapper[4807]: E1127 11:13:17.272397 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0af6172037604eb53e48d69b8e6f14d65daf61f55ce5f6d0cc190f2c140e70a9\": container with ID starting with 0af6172037604eb53e48d69b8e6f14d65daf61f55ce5f6d0cc190f2c140e70a9 not found: ID does not exist" containerID="0af6172037604eb53e48d69b8e6f14d65daf61f55ce5f6d0cc190f2c140e70a9" Nov 27 11:13:17 crc 
kubenswrapper[4807]: I1127 11:13:17.272434 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af6172037604eb53e48d69b8e6f14d65daf61f55ce5f6d0cc190f2c140e70a9"} err="failed to get container status \"0af6172037604eb53e48d69b8e6f14d65daf61f55ce5f6d0cc190f2c140e70a9\": rpc error: code = NotFound desc = could not find container \"0af6172037604eb53e48d69b8e6f14d65daf61f55ce5f6d0cc190f2c140e70a9\": container with ID starting with 0af6172037604eb53e48d69b8e6f14d65daf61f55ce5f6d0cc190f2c140e70a9 not found: ID does not exist" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.272458 4807 scope.go:117] "RemoveContainer" containerID="1a2146acc723cc20833dd626417af8ebffe5ab90b10817b282b0d18945e21f6f" Nov 27 11:13:17 crc kubenswrapper[4807]: E1127 11:13:17.272837 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a2146acc723cc20833dd626417af8ebffe5ab90b10817b282b0d18945e21f6f\": container with ID starting with 1a2146acc723cc20833dd626417af8ebffe5ab90b10817b282b0d18945e21f6f not found: ID does not exist" containerID="1a2146acc723cc20833dd626417af8ebffe5ab90b10817b282b0d18945e21f6f" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.272891 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a2146acc723cc20833dd626417af8ebffe5ab90b10817b282b0d18945e21f6f"} err="failed to get container status \"1a2146acc723cc20833dd626417af8ebffe5ab90b10817b282b0d18945e21f6f\": rpc error: code = NotFound desc = could not find container \"1a2146acc723cc20833dd626417af8ebffe5ab90b10817b282b0d18945e21f6f\": container with ID starting with 1a2146acc723cc20833dd626417af8ebffe5ab90b10817b282b0d18945e21f6f not found: ID does not exist" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.272924 4807 scope.go:117] "RemoveContainer" containerID="6617f5552a5be9cb1d47e1efef730cfc812a511591b28d13d051ea011529efd3" Nov 27 
11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.942180 4807 generic.go:334] "Generic (PLEG): container finished" podID="7674389a-8181-488b-bb03-d97eee98df00" containerID="050a36180f9e7911ef9624ecb3576188de7502de8109999d63a45c386e611687" exitCode=0 Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.942284 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7674389a-8181-488b-bb03-d97eee98df00","Type":"ContainerDied","Data":"050a36180f9e7911ef9624ecb3576188de7502de8109999d63a45c386e611687"} Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.943035 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.943428 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.943935 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.944232 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.944626 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.945306 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q6pz5_c5121ac2-4e63-4d46-b899-89bbfbb19550/marketplace-operator/0.log" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.945377 4807 generic.go:334] "Generic (PLEG): container finished" podID="c5121ac2-4e63-4d46-b899-89bbfbb19550" containerID="51f679782a1ca666345dca273b13ef5b1a7509fc505ee24d60369dbd85b3e00d" exitCode=1 Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.945295 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.945465 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" event={"ID":"c5121ac2-4e63-4d46-b899-89bbfbb19550","Type":"ContainerDied","Data":"51f679782a1ca666345dca273b13ef5b1a7509fc505ee24d60369dbd85b3e00d"} Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.945831 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" 
pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.945888 4807 scope.go:117] "RemoveContainer" containerID="51f679782a1ca666345dca273b13ef5b1a7509fc505ee24d60369dbd85b3e00d" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.946342 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.946774 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.947123 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.947672 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 
38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.948147 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.948635 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.949101 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.949444 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.950451 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.951234 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.952620 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.953175 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977" exitCode=0 Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.953192 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71" exitCode=0 Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.953199 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b" exitCode=0 Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.953208 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd" exitCode=2 Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.953274 4807 scope.go:117] "RemoveContainer" containerID="08868f802d70f0978217f99a06d02985f4cb011598b44f81c3f26fc41b458f5b" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.954346 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d4f2bfa6c7a2867cb87bcf3419dfee6ad3e70bd2f4502bedd7c2608357e3737c"} Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.954386 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"02a45de0d9a64430f3fddcb00c8f4a2534d188b7ebadf0e8fa60d33847a88913"} Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.955153 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.955756 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.956036 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.956735 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.957149 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.957589 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.957987 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:17 crc kubenswrapper[4807]: I1127 11:13:17.958471 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.967750 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q6pz5_c5121ac2-4e63-4d46-b899-89bbfbb19550/marketplace-operator/1.log" Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.968658 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q6pz5_c5121ac2-4e63-4d46-b899-89bbfbb19550/marketplace-operator/0.log" Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.968704 4807 generic.go:334] "Generic (PLEG): container finished" podID="c5121ac2-4e63-4d46-b899-89bbfbb19550" containerID="516437dec263a9b5ba1839772d70b748df237a7004712cd3a48d289fdc556d8b" exitCode=1 Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.968787 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" event={"ID":"c5121ac2-4e63-4d46-b899-89bbfbb19550","Type":"ContainerDied","Data":"516437dec263a9b5ba1839772d70b748df237a7004712cd3a48d289fdc556d8b"} Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.968827 4807 scope.go:117] "RemoveContainer" containerID="51f679782a1ca666345dca273b13ef5b1a7509fc505ee24d60369dbd85b3e00d" Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.969333 4807 scope.go:117] "RemoveContainer" containerID="516437dec263a9b5ba1839772d70b748df237a7004712cd3a48d289fdc556d8b" Nov 27 11:13:18 crc kubenswrapper[4807]: E1127 11:13:18.969623 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-q6pz5_openshift-marketplace(c5121ac2-4e63-4d46-b899-89bbfbb19550)\"" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.970009 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" 
pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.970217 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.970424 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.970620 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.970808 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.971022 4807 status_manager.go:851] "Failed to get status for pod" 
podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.971486 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.971906 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:18 crc kubenswrapper[4807]: I1127 11:13:18.973783 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.212455 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.213909 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.214548 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.214953 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.215173 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.224357 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.224707 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.224989 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.225180 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.225395 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.225574 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.278832 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.278883 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.278929 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.278954 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.278979 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.279102 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.279145 4807 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.279235 4807 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.299588 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.300175 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.300677 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.301153 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 
11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.301422 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.301712 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.301972 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.302163 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.302400 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection 
refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.302630 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: E1127 11:13:19.370649 4807 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: E1127 11:13:19.370962 4807 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: E1127 11:13:19.371236 4807 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: E1127 11:13:19.371528 4807 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: E1127 11:13:19.371737 4807 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.371763 4807 controller.go:115] "failed to update lease using 
latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Nov 27 11:13:19 crc kubenswrapper[4807]: E1127 11:13:19.371931 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="200ms" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.380353 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7674389a-8181-488b-bb03-d97eee98df00-kubelet-dir\") pod \"7674389a-8181-488b-bb03-d97eee98df00\" (UID: \"7674389a-8181-488b-bb03-d97eee98df00\") " Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.380404 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7674389a-8181-488b-bb03-d97eee98df00-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7674389a-8181-488b-bb03-d97eee98df00" (UID: "7674389a-8181-488b-bb03-d97eee98df00"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.380479 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7674389a-8181-488b-bb03-d97eee98df00-var-lock\") pod \"7674389a-8181-488b-bb03-d97eee98df00\" (UID: \"7674389a-8181-488b-bb03-d97eee98df00\") " Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.380508 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7674389a-8181-488b-bb03-d97eee98df00-kube-api-access\") pod \"7674389a-8181-488b-bb03-d97eee98df00\" (UID: \"7674389a-8181-488b-bb03-d97eee98df00\") " Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.380599 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7674389a-8181-488b-bb03-d97eee98df00-var-lock" (OuterVolumeSpecName: "var-lock") pod "7674389a-8181-488b-bb03-d97eee98df00" (UID: "7674389a-8181-488b-bb03-d97eee98df00"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.380763 4807 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.380785 4807 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7674389a-8181-488b-bb03-d97eee98df00-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.380793 4807 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/7674389a-8181-488b-bb03-d97eee98df00-var-lock\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.385052 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7674389a-8181-488b-bb03-d97eee98df00-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7674389a-8181-488b-bb03-d97eee98df00" (UID: "7674389a-8181-488b-bb03-d97eee98df00"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.481441 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7674389a-8181-488b-bb03-d97eee98df00-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.535597 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.535865 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.536235 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.536525 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.536818 4807 
status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.537046 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.537402 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.538331 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.538634 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 
11:13:19.539536 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Nov 27 11:13:19 crc kubenswrapper[4807]: E1127 11:13:19.573086 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="400ms" Nov 27 11:13:19 crc kubenswrapper[4807]: E1127 11:13:19.973469 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="800ms" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.980960 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q6pz5_c5121ac2-4e63-4d46-b899-89bbfbb19550/marketplace-operator/1.log" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.981407 4807 scope.go:117] "RemoveContainer" containerID="516437dec263a9b5ba1839772d70b748df237a7004712cd3a48d289fdc556d8b" Nov 27 11:13:19 crc kubenswrapper[4807]: E1127 11:13:19.981638 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-q6pz5_openshift-marketplace(c5121ac2-4e63-4d46-b899-89bbfbb19550)\"" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.981725 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.982070 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.982413 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.982463 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"7674389a-8181-488b-bb03-d97eee98df00","Type":"ContainerDied","Data":"c22ecf272f90841640c72ed319cff6d766eca0d5af8f1ae220c413145d70dd4b"} Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.982601 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c22ecf272f90841640c72ed319cff6d766eca0d5af8f1ae220c413145d70dd4b" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.982502 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.983030 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.983306 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.983529 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.983740 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.983950 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.986182 4807 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b" exitCode=0 Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.986226 4807 scope.go:117] "RemoveContainer" containerID="e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.986351 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.986733 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.986896 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.987092 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection 
refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.987238 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.987504 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.988049 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.988328 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.988563 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: 
connect: connection refused" Nov 27 11:13:19 crc kubenswrapper[4807]: I1127 11:13:19.988793 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.000393 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.000552 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.001227 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.001493 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 
38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.001870 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.002104 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.002298 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.002451 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.002590 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.002761 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.002899 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.003037 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.003172 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.003561 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 
38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.003794 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.004060 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.004385 4807 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.004804 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.007539 4807 scope.go:117] "RemoveContainer" containerID="a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.020597 4807 scope.go:117] "RemoveContainer" 
containerID="241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.034771 4807 scope.go:117] "RemoveContainer" containerID="9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.046191 4807 scope.go:117] "RemoveContainer" containerID="48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.062457 4807 scope.go:117] "RemoveContainer" containerID="f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.079452 4807 scope.go:117] "RemoveContainer" containerID="e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977" Nov 27 11:13:20 crc kubenswrapper[4807]: E1127 11:13:20.079847 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\": container with ID starting with e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977 not found: ID does not exist" containerID="e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.079971 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977"} err="failed to get container status \"e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\": rpc error: code = NotFound desc = could not find container \"e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977\": container with ID starting with e839ca01d5d377ea361fd0e390b1f4c97a4e4da103bfbeaf29711bacc36cf977 not found: ID does not exist" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.080058 4807 scope.go:117] "RemoveContainer" 
containerID="a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71" Nov 27 11:13:20 crc kubenswrapper[4807]: E1127 11:13:20.080736 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\": container with ID starting with a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71 not found: ID does not exist" containerID="a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.080767 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71"} err="failed to get container status \"a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\": rpc error: code = NotFound desc = could not find container \"a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71\": container with ID starting with a84317834594831f35c82303125b6ad8e73e5527a060c7aa6d4c59f47ae97f71 not found: ID does not exist" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.080789 4807 scope.go:117] "RemoveContainer" containerID="241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b" Nov 27 11:13:20 crc kubenswrapper[4807]: E1127 11:13:20.080967 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\": container with ID starting with 241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b not found: ID does not exist" containerID="241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.080985 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b"} err="failed to get container status \"241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\": rpc error: code = NotFound desc = could not find container \"241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b\": container with ID starting with 241598eda9fe4cebc6eed5f0d5ad20d96a8ba35aaf0e5c3661fa807dcaf4f52b not found: ID does not exist" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.080999 4807 scope.go:117] "RemoveContainer" containerID="9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd" Nov 27 11:13:20 crc kubenswrapper[4807]: E1127 11:13:20.081173 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\": container with ID starting with 9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd not found: ID does not exist" containerID="9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.081193 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd"} err="failed to get container status \"9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\": rpc error: code = NotFound desc = could not find container \"9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd\": container with ID starting with 9a6b4660f87bf7e762d53dfd06c627ae31eaaf66c8422259f51d4373e34c25dd not found: ID does not exist" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.081205 4807 scope.go:117] "RemoveContainer" containerID="48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b" Nov 27 11:13:20 crc kubenswrapper[4807]: E1127 11:13:20.082630 4807 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\": container with ID starting with 48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b not found: ID does not exist" containerID="48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.082666 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b"} err="failed to get container status \"48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\": rpc error: code = NotFound desc = could not find container \"48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b\": container with ID starting with 48bb228e576da6eeb571aa65bb5fd8076d0714fd6a3f560d85b8ba9d36c5f82b not found: ID does not exist" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.082687 4807 scope.go:117] "RemoveContainer" containerID="f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156" Nov 27 11:13:20 crc kubenswrapper[4807]: E1127 11:13:20.082931 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\": container with ID starting with f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156 not found: ID does not exist" containerID="f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156" Nov 27 11:13:20 crc kubenswrapper[4807]: I1127 11:13:20.082969 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156"} err="failed to get container status \"f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\": rpc error: code = NotFound desc = could not find container 
\"f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156\": container with ID starting with f4024bbfda7544d3169e326c54fc9f72b225078165e81a13a690e016f9fe9156 not found: ID does not exist" Nov 27 11:13:20 crc kubenswrapper[4807]: E1127 11:13:20.775296 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="1.6s" Nov 27 11:13:22 crc kubenswrapper[4807]: E1127 11:13:22.376086 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="3.2s" Nov 27 11:13:25 crc kubenswrapper[4807]: E1127 11:13:25.577469 4807 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.204:6443: connect: connection refused" interval="6.4s" Nov 27 11:13:26 crc kubenswrapper[4807]: I1127 11:13:26.195338 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:26 crc kubenswrapper[4807]: I1127 11:13:26.195424 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:26 crc kubenswrapper[4807]: I1127 11:13:26.196121 4807 scope.go:117] "RemoveContainer" containerID="516437dec263a9b5ba1839772d70b748df237a7004712cd3a48d289fdc556d8b" Nov 27 11:13:26 crc kubenswrapper[4807]: E1127 11:13:26.196569 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting 
failed container=marketplace-operator pod=marketplace-operator-79b997595-q6pz5_openshift-marketplace(c5121ac2-4e63-4d46-b899-89bbfbb19550)\"" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" Nov 27 11:13:26 crc kubenswrapper[4807]: E1127 11:13:26.494127 4807 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.204:6443: connect: connection refused" event=< Nov 27 11:13:26 crc kubenswrapper[4807]: &Event{ObjectMeta:{marketplace-operator-79b997595-q6pz5.187bd8bfb4dc1db5 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:marketplace-operator-79b997595-q6pz5,UID:c5121ac2-4e63-4d46-b899-89bbfbb19550,APIVersion:v1,ResourceVersion:29447,FieldPath:spec.containers{marketplace-operator},},Reason:ProbeError,Message:Readiness probe error: Get "http://10.217.0.57:8080/healthz": dial tcp 10.217.0.57:8080: connect: connection refused Nov 27 11:13:26 crc kubenswrapper[4807]: body: Nov 27 11:13:26 crc kubenswrapper[4807]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-27 11:13:16.930358709 +0000 UTC m=+238.029856907,LastTimestamp:2025-11-27 11:13:16.930358709 +0000 UTC m=+238.029856907,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Nov 27 11:13:26 crc kubenswrapper[4807]: > Nov 27 11:13:29 crc kubenswrapper[4807]: I1127 11:13:29.534369 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 
11:13:29 crc kubenswrapper[4807]: I1127 11:13:29.535076 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:29 crc kubenswrapper[4807]: I1127 11:13:29.535495 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:29 crc kubenswrapper[4807]: I1127 11:13:29.535963 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:29 crc kubenswrapper[4807]: I1127 11:13:29.536510 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:29 crc kubenswrapper[4807]: I1127 11:13:29.537030 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection 
refused" Nov 27 11:13:29 crc kubenswrapper[4807]: I1127 11:13:29.537543 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:29 crc kubenswrapper[4807]: I1127 11:13:29.537993 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:30 crc kubenswrapper[4807]: I1127 11:13:30.531793 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:30 crc kubenswrapper[4807]: I1127 11:13:30.533303 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:30 crc kubenswrapper[4807]: I1127 11:13:30.533859 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:30 crc kubenswrapper[4807]: I1127 11:13:30.534467 4807 status_manager.go:851] "Failed to get status for pod" 
podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:30 crc kubenswrapper[4807]: I1127 11:13:30.535050 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:30 crc kubenswrapper[4807]: I1127 11:13:30.535797 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:30 crc kubenswrapper[4807]: I1127 11:13:30.536152 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:30 crc kubenswrapper[4807]: I1127 11:13:30.536553 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:30 crc kubenswrapper[4807]: I1127 11:13:30.536864 4807 status_manager.go:851] "Failed to get status for 
pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:30 crc kubenswrapper[4807]: I1127 11:13:30.554669 4807 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="05153e77-990e-4b38-89e3-d4f962674fa9" Nov 27 11:13:30 crc kubenswrapper[4807]: I1127 11:13:30.554724 4807 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="05153e77-990e-4b38-89e3-d4f962674fa9" Nov 27 11:13:30 crc kubenswrapper[4807]: E1127 11:13:30.555382 4807 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:30 crc kubenswrapper[4807]: I1127 11:13:30.556052 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:30 crc kubenswrapper[4807]: W1127 11:13:30.583740 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-485475fa7ca86b275e62199ae274bab44b40b3c63d92a47047bbf61709fc44b0 WatchSource:0}: Error finding container 485475fa7ca86b275e62199ae274bab44b40b3c63d92a47047bbf61709fc44b0: Status 404 returned error can't find the container with id 485475fa7ca86b275e62199ae274bab44b40b3c63d92a47047bbf61709fc44b0 Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.056817 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.057103 4807 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d" exitCode=1 Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.057160 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d"} Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.058120 4807 scope.go:117] "RemoveContainer" containerID="b277fe2d8c084ec426a5981524258866bed52754b559d0797b89a8b6e4d49b8d" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.058788 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 
38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.059332 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.059635 4807 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="38508cc29ee1a4da27c671e3ef7e303c2f9ec1aa5f39cca486331105c9a39eca" exitCode=0 Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.059674 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"38508cc29ee1a4da27c671e3ef7e303c2f9ec1aa5f39cca486331105c9a39eca"} Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.059708 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"485475fa7ca86b275e62199ae274bab44b40b3c63d92a47047bbf61709fc44b0"} Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.059955 4807 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="05153e77-990e-4b38-89e3-d4f962674fa9" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.059970 4807 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="05153e77-990e-4b38-89e3-d4f962674fa9" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.060021 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" pod="openshift-marketplace/redhat-operators-vhk66" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: E1127 11:13:31.060339 4807 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.060558 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.060805 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.061013 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.061217 4807 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.061495 4807 status_manager.go:851] "Failed to get status for pod" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.062001 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.062523 4807 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.063061 4807 status_manager.go:851] "Failed to get status for pod" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-q6pz5\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.063538 4807 status_manager.go:851] "Failed to get status for pod" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" 
pod="openshift-marketplace/redhat-operators-vhk66" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-vhk66\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.063916 4807 status_manager.go:851] "Failed to get status for pod" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" pod="openshift-marketplace/redhat-marketplace-bl9gg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-bl9gg\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.064384 4807 status_manager.go:851] "Failed to get status for pod" podUID="7984b376-029e-465e-893e-f62f047ee418" pod="openshift-marketplace/community-operators-4zn8d" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-4zn8d\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.064753 4807 status_manager.go:851] "Failed to get status for pod" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" pod="openshift-marketplace/marketplace-operator-79b997595-82ntv" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/marketplace-operator-79b997595-82ntv\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.065087 4807 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.065786 4807 status_manager.go:851] "Failed to get status for pod" 
podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" pod="openshift-marketplace/certified-operators-f8qqm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-f8qqm\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:31 crc kubenswrapper[4807]: I1127 11:13:31.066153 4807 status_manager.go:851] "Failed to get status for pod" podUID="7674389a-8181-488b-bb03-d97eee98df00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.204:6443: connect: connection refused" Nov 27 11:13:32 crc kubenswrapper[4807]: I1127 11:13:32.069816 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Nov 27 11:13:32 crc kubenswrapper[4807]: I1127 11:13:32.070192 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"01e7237a6286d18e6fecb00ae3b820bbf1590fee23f5a8b82c2158a49a5f559a"} Nov 27 11:13:32 crc kubenswrapper[4807]: I1127 11:13:32.073325 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3fdb2150aedd34a4954a3e2cfb700f956bb2c0630393be8ae906a54c945b523f"} Nov 27 11:13:32 crc kubenswrapper[4807]: I1127 11:13:32.073384 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"057d28dbed422f9560c6a7f3034ab9f465c15c315a476122cf5f4bc8cf2d7b61"} Nov 27 11:13:32 crc kubenswrapper[4807]: I1127 11:13:32.073397 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9fa907ca0a880aa8a527f777cb3f7c5a963be4a891bcdab73540cf5cb92847bd"} Nov 27 11:13:32 crc kubenswrapper[4807]: I1127 11:13:32.073407 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2e9cd84e1900f653fa6ef66a32b26b55108c2db74277cfeda60efd8d4a9039c4"} Nov 27 11:13:33 crc kubenswrapper[4807]: I1127 11:13:33.085119 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"118b381bdb6e0e65fe5f3545ad245778e72a0c814146b95db049cd7fcbd3c7e7"} Nov 27 11:13:33 crc kubenswrapper[4807]: I1127 11:13:33.085904 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:33 crc kubenswrapper[4807]: I1127 11:13:33.085634 4807 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="05153e77-990e-4b38-89e3-d4f962674fa9" Nov 27 11:13:33 crc kubenswrapper[4807]: I1127 11:13:33.086238 4807 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="05153e77-990e-4b38-89e3-d4f962674fa9" Nov 27 11:13:35 crc kubenswrapper[4807]: I1127 11:13:35.557235 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:35 crc kubenswrapper[4807]: I1127 11:13:35.557310 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:35 crc kubenswrapper[4807]: I1127 11:13:35.566107 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:38 
crc kubenswrapper[4807]: I1127 11:13:38.094821 4807 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:39 crc kubenswrapper[4807]: I1127 11:13:39.120636 4807 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="05153e77-990e-4b38-89e3-d4f962674fa9" Nov 27 11:13:39 crc kubenswrapper[4807]: I1127 11:13:39.120675 4807 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="05153e77-990e-4b38-89e3-d4f962674fa9" Nov 27 11:13:39 crc kubenswrapper[4807]: I1127 11:13:39.124418 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:39 crc kubenswrapper[4807]: I1127 11:13:39.478330 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:13:39 crc kubenswrapper[4807]: I1127 11:13:39.549716 4807 scope.go:117] "RemoveContainer" containerID="516437dec263a9b5ba1839772d70b748df237a7004712cd3a48d289fdc556d8b" Nov 27 11:13:39 crc kubenswrapper[4807]: I1127 11:13:39.551747 4807 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7f8a5a01-b235-4855-b9d0-d31fa820d991" Nov 27 11:13:39 crc kubenswrapper[4807]: I1127 11:13:39.711567 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:13:39 crc kubenswrapper[4807]: I1127 11:13:39.716362 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:13:40 crc kubenswrapper[4807]: I1127 11:13:40.126267 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q6pz5_c5121ac2-4e63-4d46-b899-89bbfbb19550/marketplace-operator/2.log" Nov 27 11:13:40 crc kubenswrapper[4807]: I1127 11:13:40.127532 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q6pz5_c5121ac2-4e63-4d46-b899-89bbfbb19550/marketplace-operator/1.log" Nov 27 11:13:40 crc kubenswrapper[4807]: I1127 11:13:40.127655 4807 generic.go:334] "Generic (PLEG): container finished" podID="c5121ac2-4e63-4d46-b899-89bbfbb19550" containerID="e36e8d8324a277a4814f0fe728cebcb75faefae028c7af52e6b752273d907248" exitCode=1 Nov 27 11:13:40 crc kubenswrapper[4807]: I1127 11:13:40.127707 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" event={"ID":"c5121ac2-4e63-4d46-b899-89bbfbb19550","Type":"ContainerDied","Data":"e36e8d8324a277a4814f0fe728cebcb75faefae028c7af52e6b752273d907248"} Nov 27 11:13:40 crc kubenswrapper[4807]: I1127 11:13:40.127813 4807 scope.go:117] "RemoveContainer" containerID="516437dec263a9b5ba1839772d70b748df237a7004712cd3a48d289fdc556d8b" Nov 27 11:13:40 crc kubenswrapper[4807]: I1127 11:13:40.128172 4807 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="05153e77-990e-4b38-89e3-d4f962674fa9" Nov 27 11:13:40 crc kubenswrapper[4807]: I1127 11:13:40.128190 4807 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="05153e77-990e-4b38-89e3-d4f962674fa9" Nov 27 11:13:40 crc kubenswrapper[4807]: I1127 11:13:40.128621 4807 scope.go:117] "RemoveContainer" containerID="e36e8d8324a277a4814f0fe728cebcb75faefae028c7af52e6b752273d907248" Nov 27 11:13:40 crc kubenswrapper[4807]: E1127 11:13:40.128861 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=marketplace-operator pod=marketplace-operator-79b997595-q6pz5_openshift-marketplace(c5121ac2-4e63-4d46-b899-89bbfbb19550)\"" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" Nov 27 11:13:40 crc kubenswrapper[4807]: I1127 11:13:40.130815 4807 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7f8a5a01-b235-4855-b9d0-d31fa820d991" Nov 27 11:13:41 crc kubenswrapper[4807]: I1127 11:13:41.134043 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q6pz5_c5121ac2-4e63-4d46-b899-89bbfbb19550/marketplace-operator/2.log" Nov 27 11:13:41 crc kubenswrapper[4807]: I1127 11:13:41.141452 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 27 11:13:46 crc kubenswrapper[4807]: I1127 11:13:46.194713 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:46 crc kubenswrapper[4807]: I1127 11:13:46.195428 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:13:46 crc kubenswrapper[4807]: I1127 11:13:46.196094 4807 scope.go:117] "RemoveContainer" containerID="e36e8d8324a277a4814f0fe728cebcb75faefae028c7af52e6b752273d907248" Nov 27 11:13:46 crc kubenswrapper[4807]: E1127 11:13:46.196551 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-q6pz5_openshift-marketplace(c5121ac2-4e63-4d46-b899-89bbfbb19550)\"" 
pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" Nov 27 11:13:47 crc kubenswrapper[4807]: I1127 11:13:47.167429 4807 scope.go:117] "RemoveContainer" containerID="e36e8d8324a277a4814f0fe728cebcb75faefae028c7af52e6b752273d907248" Nov 27 11:13:47 crc kubenswrapper[4807]: E1127 11:13:47.167753 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-q6pz5_openshift-marketplace(c5121ac2-4e63-4d46-b899-89bbfbb19550)\"" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" Nov 27 11:13:47 crc kubenswrapper[4807]: I1127 11:13:47.679591 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 27 11:13:47 crc kubenswrapper[4807]: I1127 11:13:47.798897 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 27 11:13:48 crc kubenswrapper[4807]: I1127 11:13:48.133385 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Nov 27 11:13:48 crc kubenswrapper[4807]: I1127 11:13:48.280163 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 27 11:13:48 crc kubenswrapper[4807]: I1127 11:13:48.707268 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 27 11:13:48 crc kubenswrapper[4807]: I1127 11:13:48.968833 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 27 11:13:49 crc kubenswrapper[4807]: I1127 11:13:49.498011 4807 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 27 11:13:49 crc kubenswrapper[4807]: I1127 11:13:49.503602 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 27 11:13:49 crc kubenswrapper[4807]: I1127 11:13:49.935852 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 27 11:13:49 crc kubenswrapper[4807]: I1127 11:13:49.937434 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 27 11:13:50 crc kubenswrapper[4807]: I1127 11:13:50.065797 4807 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Nov 27 11:13:50 crc kubenswrapper[4807]: I1127 11:13:50.150124 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 27 11:13:50 crc kubenswrapper[4807]: I1127 11:13:50.340301 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 27 11:13:50 crc kubenswrapper[4807]: I1127 11:13:50.570219 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Nov 27 11:13:50 crc kubenswrapper[4807]: I1127 11:13:50.584829 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 27 11:13:50 crc kubenswrapper[4807]: I1127 11:13:50.844557 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 27 11:13:50 crc kubenswrapper[4807]: I1127 11:13:50.933707 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 27 11:13:51 crc kubenswrapper[4807]: I1127 11:13:51.037800 4807 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-machine-config-operator"/"proxy-tls" Nov 27 11:13:51 crc kubenswrapper[4807]: I1127 11:13:51.186050 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 27 11:13:51 crc kubenswrapper[4807]: I1127 11:13:51.214538 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 27 11:13:51 crc kubenswrapper[4807]: I1127 11:13:51.277481 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 27 11:13:51 crc kubenswrapper[4807]: I1127 11:13:51.330230 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 27 11:13:51 crc kubenswrapper[4807]: I1127 11:13:51.396056 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 27 11:13:51 crc kubenswrapper[4807]: I1127 11:13:51.448502 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 27 11:13:51 crc kubenswrapper[4807]: I1127 11:13:51.506574 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Nov 27 11:13:51 crc kubenswrapper[4807]: I1127 11:13:51.679362 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 27 11:13:51 crc kubenswrapper[4807]: I1127 11:13:51.925883 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 27 11:13:51 crc kubenswrapper[4807]: I1127 11:13:51.946929 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 
11:13:52.028143 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.126168 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.157942 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.172760 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.195121 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.217842 4807 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.227182 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.400310 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.480929 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.539634 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.693181 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 27 
11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.729040 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.822187 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.876691 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.883019 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 27 11:13:52 crc kubenswrapper[4807]: I1127 11:13:52.944499 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.211646 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.222805 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.293400 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.295338 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.317741 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.340951 4807 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.391966 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.415846 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.485040 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.489281 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.660642 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.755688 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.772337 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.858414 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.990538 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 27 11:13:53 crc kubenswrapper[4807]: I1127 11:13:53.998945 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.000842 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.058119 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.059268 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.066648 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.083060 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.098094 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.116331 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.141532 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.163009 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.189512 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 27 11:13:54 crc 
kubenswrapper[4807]: I1127 11:13:54.201008 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.204825 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.207896 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.349490 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.476544 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.514220 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.573980 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.679901 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.786148 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.934225 4807 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Nov 27 11:13:54 crc kubenswrapper[4807]: I1127 11:13:54.967662 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 
11:13:55.115687 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.163825 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.267711 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.269357 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.328442 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.382699 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.404359 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.438604 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.649062 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.735590 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.740288 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-default-metrics-tls" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.825273 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.894104 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.921907 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.972218 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.974083 4807 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Nov 27 11:13:55 crc kubenswrapper[4807]: I1127 11:13:55.985678 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.049902 4807 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.051660 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=40.051634794 podStartE2EDuration="40.051634794s" podCreationTimestamp="2025-11-27 11:13:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:13:37.582856261 +0000 UTC m=+258.682354459" watchObservedRunningTime="2025-11-27 11:13:56.051634794 +0000 UTC m=+277.151133002" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.051882 4807 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.053717 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.062287 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f8qqm","openshift-marketplace/community-operators-4zn8d","openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/marketplace-operator-79b997595-82ntv","openshift-marketplace/redhat-marketplace-bl9gg","openshift-marketplace/redhat-operators-vhk66"] Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.062414 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.073229 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.091068 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.091051473 podStartE2EDuration="18.091051473s" podCreationTimestamp="2025-11-27 11:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:13:56.089760336 +0000 UTC m=+277.189258534" watchObservedRunningTime="2025-11-27 11:13:56.091051473 +0000 UTC m=+277.190549671" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.186381 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.237590 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.314894 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.318295 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.440590 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.470709 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.477812 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.486170 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.499231 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.500895 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.585448 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.654975 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 
11:13:56.695520 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.745743 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.760903 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.905053 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.950571 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.953936 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 27 11:13:56 crc kubenswrapper[4807]: I1127 11:13:56.989959 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.047444 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.070510 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.157814 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.312626 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" 
Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.315505 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.537067 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.539564 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16652a33-af22-4522-bd9c-8491bd6ae24f" path="/var/lib/kubelet/pods/16652a33-af22-4522-bd9c-8491bd6ae24f/volumes" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.540459 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5" path="/var/lib/kubelet/pods/4a8d58fc-68ed-46cd-bd2d-23ac69cac2b5/volumes" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.541091 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e5ef23-71ad-40ae-81bb-e94d9d298087" path="/var/lib/kubelet/pods/65e5ef23-71ad-40ae-81bb-e94d9d298087/volumes" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.541983 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7984b376-029e-465e-893e-f62f047ee418" path="/var/lib/kubelet/pods/7984b376-029e-465e-893e-f62f047ee418/volumes" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.542558 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c61dff77-8482-4e06-b99e-72c1cd18c4ca" path="/var/lib/kubelet/pods/c61dff77-8482-4e06-b99e-72c1cd18c4ca/volumes" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.609098 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.688864 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 27 11:13:57 
crc kubenswrapper[4807]: I1127 11:13:57.705336 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.708213 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.770399 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.807586 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.858973 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.886574 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 27 11:13:57 crc kubenswrapper[4807]: I1127 11:13:57.892116 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.021230 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.079215 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.088108 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.248570 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.345760 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.398397 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.466996 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.473463 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.481445 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.483862 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.532663 4807 scope.go:117] "RemoveContainer" containerID="e36e8d8324a277a4814f0fe728cebcb75faefae028c7af52e6b752273d907248" Nov 27 11:13:58 crc kubenswrapper[4807]: E1127 11:13:58.532895 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-q6pz5_openshift-marketplace(c5121ac2-4e63-4d46-b899-89bbfbb19550)\"" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" podUID="c5121ac2-4e63-4d46-b899-89bbfbb19550" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.559001 4807 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.560815 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.591623 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.600149 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.610614 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.627307 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.764437 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.863891 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.943983 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 27 11:13:58 crc kubenswrapper[4807]: I1127 11:13:58.978259 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.074648 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.096237 4807 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.129554 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.158137 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.188806 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.192061 4807 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.252121 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.269622 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.273344 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.343375 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.365648 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.388091 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Nov 27 11:13:59 crc 
kubenswrapper[4807]: I1127 11:13:59.414532 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.425725 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.461506 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.476473 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.494901 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.545658 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.594150 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.615616 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.620271 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.696620 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.701833 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.746411 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.828997 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.829873 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 27 11:13:59 crc kubenswrapper[4807]: I1127 11:13:59.904194 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.003060 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.320345 4807 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.320614 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d4f2bfa6c7a2867cb87bcf3419dfee6ad3e70bd2f4502bedd7c2608357e3737c" gracePeriod=5 Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.345570 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.382202 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.390459 4807 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.411939 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.422272 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.466032 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.551995 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.587711 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.591607 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.597449 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.609369 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.636032 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.651280 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.746132 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.799350 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 27 11:14:00 crc kubenswrapper[4807]: I1127 11:14:00.822396 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.036016 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.062766 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.173335 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.265631 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.276456 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.281287 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.352996 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.366965 
4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.570065 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.603824 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.676440 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.706669 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.918851 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 27 11:14:01 crc kubenswrapper[4807]: I1127 11:14:01.999832 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 11:14:02.176014 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 11:14:02.184001 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 11:14:02.193561 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 11:14:02.270260 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 27 11:14:02 crc 
kubenswrapper[4807]: I1127 11:14:02.322635 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 11:14:02.327013 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 11:14:02.370119 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 11:14:02.570560 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 11:14:02.602181 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 11:14:02.609884 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 11:14:02.720584 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 11:14:02.765422 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 11:14:02.863671 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 11:14:02.926987 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 27 11:14:02 crc kubenswrapper[4807]: I1127 
11:14:02.962613 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 27 11:14:03 crc kubenswrapper[4807]: I1127 11:14:03.032066 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Nov 27 11:14:03 crc kubenswrapper[4807]: I1127 11:14:03.130269 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 27 11:14:03 crc kubenswrapper[4807]: I1127 11:14:03.184191 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Nov 27 11:14:03 crc kubenswrapper[4807]: I1127 11:14:03.197701 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 27 11:14:03 crc kubenswrapper[4807]: I1127 11:14:03.252957 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 27 11:14:03 crc kubenswrapper[4807]: I1127 11:14:03.264918 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 27 11:14:03 crc kubenswrapper[4807]: I1127 11:14:03.315717 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 27 11:14:03 crc kubenswrapper[4807]: I1127 11:14:03.400155 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 27 11:14:03 crc kubenswrapper[4807]: I1127 11:14:03.471900 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 27 11:14:03 crc kubenswrapper[4807]: I1127 11:14:03.748820 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 27 
11:14:03 crc kubenswrapper[4807]: I1127 11:14:03.963042 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 27 11:14:03 crc kubenswrapper[4807]: I1127 11:14:03.979806 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 27 11:14:04 crc kubenswrapper[4807]: I1127 11:14:04.150062 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 27 11:14:04 crc kubenswrapper[4807]: I1127 11:14:04.238146 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 27 11:14:04 crc kubenswrapper[4807]: I1127 11:14:04.493191 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 27 11:14:04 crc kubenswrapper[4807]: I1127 11:14:04.628102 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 27 11:14:04 crc kubenswrapper[4807]: I1127 11:14:04.656186 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 27 11:14:04 crc kubenswrapper[4807]: I1127 11:14:04.676799 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 27 11:14:04 crc kubenswrapper[4807]: I1127 11:14:04.784552 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.232474 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.449143 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.449224 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.538473 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.538512 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.538533 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.538577 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.538586 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.538614 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.538642 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.538808 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.538891 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.538977 4807 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.538990 4807 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.538998 4807 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.539006 4807 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.540019 4807 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.548088 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.554361 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.554434 4807 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="b777c24e-b522-49c8-8798-dd66006344d7" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.558206 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.558274 4807 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="b777c24e-b522-49c8-8798-dd66006344d7" Nov 27 11:14:05 crc kubenswrapper[4807]: I1127 11:14:05.640478 4807 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:06 crc kubenswrapper[4807]: I1127 11:14:06.253208 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Nov 27 11:14:06 crc kubenswrapper[4807]: I1127 11:14:06.253300 4807 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d4f2bfa6c7a2867cb87bcf3419dfee6ad3e70bd2f4502bedd7c2608357e3737c" exitCode=137 Nov 27 11:14:06 crc kubenswrapper[4807]: I1127 11:14:06.253351 4807 scope.go:117] "RemoveContainer" containerID="d4f2bfa6c7a2867cb87bcf3419dfee6ad3e70bd2f4502bedd7c2608357e3737c" Nov 27 11:14:06 crc kubenswrapper[4807]: I1127 11:14:06.253389 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Nov 27 11:14:06 crc kubenswrapper[4807]: I1127 11:14:06.271114 4807 scope.go:117] "RemoveContainer" containerID="d4f2bfa6c7a2867cb87bcf3419dfee6ad3e70bd2f4502bedd7c2608357e3737c" Nov 27 11:14:06 crc kubenswrapper[4807]: E1127 11:14:06.271505 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f2bfa6c7a2867cb87bcf3419dfee6ad3e70bd2f4502bedd7c2608357e3737c\": container with ID starting with d4f2bfa6c7a2867cb87bcf3419dfee6ad3e70bd2f4502bedd7c2608357e3737c not found: ID does not exist" containerID="d4f2bfa6c7a2867cb87bcf3419dfee6ad3e70bd2f4502bedd7c2608357e3737c" Nov 27 11:14:06 crc kubenswrapper[4807]: I1127 11:14:06.271564 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f2bfa6c7a2867cb87bcf3419dfee6ad3e70bd2f4502bedd7c2608357e3737c"} err="failed to get container status \"d4f2bfa6c7a2867cb87bcf3419dfee6ad3e70bd2f4502bedd7c2608357e3737c\": rpc error: code = NotFound desc = could not find container \"d4f2bfa6c7a2867cb87bcf3419dfee6ad3e70bd2f4502bedd7c2608357e3737c\": container with ID starting with d4f2bfa6c7a2867cb87bcf3419dfee6ad3e70bd2f4502bedd7c2608357e3737c not found: ID does not exist" Nov 27 11:14:07 crc kubenswrapper[4807]: I1127 11:14:07.538499 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Nov 27 11:14:13 crc kubenswrapper[4807]: I1127 11:14:13.531963 4807 scope.go:117] "RemoveContainer" containerID="e36e8d8324a277a4814f0fe728cebcb75faefae028c7af52e6b752273d907248" Nov 27 11:14:14 crc kubenswrapper[4807]: I1127 11:14:14.310498 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q6pz5_c5121ac2-4e63-4d46-b899-89bbfbb19550/marketplace-operator/2.log" 
Nov 27 11:14:14 crc kubenswrapper[4807]: I1127 11:14:14.310886 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" event={"ID":"c5121ac2-4e63-4d46-b899-89bbfbb19550","Type":"ContainerStarted","Data":"352be91789c3cadf6c685ac69ca2b52603a60462e6146f4b63bff08b7734bb82"} Nov 27 11:14:14 crc kubenswrapper[4807]: I1127 11:14:14.311372 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:14:14 crc kubenswrapper[4807]: I1127 11:14:14.314224 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" Nov 27 11:14:14 crc kubenswrapper[4807]: I1127 11:14:14.332496 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-q6pz5" podStartSLOduration=59.332473064 podStartE2EDuration="59.332473064s" podCreationTimestamp="2025-11-27 11:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:14:14.3270859 +0000 UTC m=+295.426584108" watchObservedRunningTime="2025-11-27 11:14:14.332473064 +0000 UTC m=+295.431971282" Nov 27 11:14:27 crc kubenswrapper[4807]: I1127 11:14:27.354407 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4kzvt"] Nov 27 11:14:27 crc kubenswrapper[4807]: I1127 11:14:27.355067 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" podUID="e38d5152-81eb-46c2-9753-84286838528f" containerName="controller-manager" containerID="cri-o://fc0fb51bc5f9ece22696dd689c822e107ba5b01e4d5af5140e25d17c86315d59" gracePeriod=30 Nov 27 11:14:27 crc kubenswrapper[4807]: I1127 11:14:27.449260 4807 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z"] Nov 27 11:14:27 crc kubenswrapper[4807]: I1127 11:14:27.449454 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" podUID="648ab531-aea8-438e-9a6a-f827594e98b4" containerName="route-controller-manager" containerID="cri-o://b69ce7230ed8711a77515a9608bc519b5fe92f85a9bd5ff61d9d6364bd5955aa" gracePeriod=30 Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.315679 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.405597 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" event={"ID":"648ab531-aea8-438e-9a6a-f827594e98b4","Type":"ContainerDied","Data":"b69ce7230ed8711a77515a9608bc519b5fe92f85a9bd5ff61d9d6364bd5955aa"} Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.405578 4807 generic.go:334] "Generic (PLEG): container finished" podID="648ab531-aea8-438e-9a6a-f827594e98b4" containerID="b69ce7230ed8711a77515a9608bc519b5fe92f85a9bd5ff61d9d6364bd5955aa" exitCode=0 Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.405683 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" event={"ID":"648ab531-aea8-438e-9a6a-f827594e98b4","Type":"ContainerDied","Data":"4e06ad737ce76c08ad2c3b69ed067c2fb1c4faff4a1811a81755b88418389486"} Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.405695 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e06ad737ce76c08ad2c3b69ed067c2fb1c4faff4a1811a81755b88418389486" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.406633 4807 generic.go:334] "Generic 
(PLEG): container finished" podID="e38d5152-81eb-46c2-9753-84286838528f" containerID="fc0fb51bc5f9ece22696dd689c822e107ba5b01e4d5af5140e25d17c86315d59" exitCode=0 Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.406655 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" event={"ID":"e38d5152-81eb-46c2-9753-84286838528f","Type":"ContainerDied","Data":"fc0fb51bc5f9ece22696dd689c822e107ba5b01e4d5af5140e25d17c86315d59"} Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.406672 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" event={"ID":"e38d5152-81eb-46c2-9753-84286838528f","Type":"ContainerDied","Data":"c0027e9f2fecac93984c1920b260ec0843f99c4c12499013c66c65798f017b15"} Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.406691 4807 scope.go:117] "RemoveContainer" containerID="fc0fb51bc5f9ece22696dd689c822e107ba5b01e4d5af5140e25d17c86315d59" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.406805 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4kzvt" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.428415 4807 scope.go:117] "RemoveContainer" containerID="fc0fb51bc5f9ece22696dd689c822e107ba5b01e4d5af5140e25d17c86315d59" Nov 27 11:14:28 crc kubenswrapper[4807]: E1127 11:14:28.428764 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc0fb51bc5f9ece22696dd689c822e107ba5b01e4d5af5140e25d17c86315d59\": container with ID starting with fc0fb51bc5f9ece22696dd689c822e107ba5b01e4d5af5140e25d17c86315d59 not found: ID does not exist" containerID="fc0fb51bc5f9ece22696dd689c822e107ba5b01e4d5af5140e25d17c86315d59" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.428790 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc0fb51bc5f9ece22696dd689c822e107ba5b01e4d5af5140e25d17c86315d59"} err="failed to get container status \"fc0fb51bc5f9ece22696dd689c822e107ba5b01e4d5af5140e25d17c86315d59\": rpc error: code = NotFound desc = could not find container \"fc0fb51bc5f9ece22696dd689c822e107ba5b01e4d5af5140e25d17c86315d59\": container with ID starting with fc0fb51bc5f9ece22696dd689c822e107ba5b01e4d5af5140e25d17c86315d59 not found: ID does not exist" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.429535 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.517334 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-proxy-ca-bundles\") pod \"e38d5152-81eb-46c2-9753-84286838528f\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.517389 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tk46\" (UniqueName: \"kubernetes.io/projected/e38d5152-81eb-46c2-9753-84286838528f-kube-api-access-2tk46\") pod \"e38d5152-81eb-46c2-9753-84286838528f\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.517416 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-config\") pod \"e38d5152-81eb-46c2-9753-84286838528f\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.517453 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e38d5152-81eb-46c2-9753-84286838528f-serving-cert\") pod \"e38d5152-81eb-46c2-9753-84286838528f\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.517480 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-client-ca\") pod \"e38d5152-81eb-46c2-9753-84286838528f\" (UID: \"e38d5152-81eb-46c2-9753-84286838528f\") " Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.518372 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-client-ca" (OuterVolumeSpecName: "client-ca") pod "e38d5152-81eb-46c2-9753-84286838528f" (UID: "e38d5152-81eb-46c2-9753-84286838528f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.518417 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e38d5152-81eb-46c2-9753-84286838528f" (UID: "e38d5152-81eb-46c2-9753-84286838528f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.518438 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-config" (OuterVolumeSpecName: "config") pod "e38d5152-81eb-46c2-9753-84286838528f" (UID: "e38d5152-81eb-46c2-9753-84286838528f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.526731 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e38d5152-81eb-46c2-9753-84286838528f-kube-api-access-2tk46" (OuterVolumeSpecName: "kube-api-access-2tk46") pod "e38d5152-81eb-46c2-9753-84286838528f" (UID: "e38d5152-81eb-46c2-9753-84286838528f"). InnerVolumeSpecName "kube-api-access-2tk46". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.533616 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e38d5152-81eb-46c2-9753-84286838528f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e38d5152-81eb-46c2-9753-84286838528f" (UID: "e38d5152-81eb-46c2-9753-84286838528f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.618903 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/648ab531-aea8-438e-9a6a-f827594e98b4-client-ca\") pod \"648ab531-aea8-438e-9a6a-f827594e98b4\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.619131 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjj7h\" (UniqueName: \"kubernetes.io/projected/648ab531-aea8-438e-9a6a-f827594e98b4-kube-api-access-qjj7h\") pod \"648ab531-aea8-438e-9a6a-f827594e98b4\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.619206 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/648ab531-aea8-438e-9a6a-f827594e98b4-config\") pod \"648ab531-aea8-438e-9a6a-f827594e98b4\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.619275 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/648ab531-aea8-438e-9a6a-f827594e98b4-serving-cert\") pod \"648ab531-aea8-438e-9a6a-f827594e98b4\" (UID: \"648ab531-aea8-438e-9a6a-f827594e98b4\") " Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.619604 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e38d5152-81eb-46c2-9753-84286838528f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.619627 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:28 crc 
kubenswrapper[4807]: I1127 11:14:28.619644 4807 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.619661 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tk46\" (UniqueName: \"kubernetes.io/projected/e38d5152-81eb-46c2-9753-84286838528f-kube-api-access-2tk46\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.619672 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e38d5152-81eb-46c2-9753-84286838528f-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.619863 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/648ab531-aea8-438e-9a6a-f827594e98b4-client-ca" (OuterVolumeSpecName: "client-ca") pod "648ab531-aea8-438e-9a6a-f827594e98b4" (UID: "648ab531-aea8-438e-9a6a-f827594e98b4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.619935 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/648ab531-aea8-438e-9a6a-f827594e98b4-config" (OuterVolumeSpecName: "config") pod "648ab531-aea8-438e-9a6a-f827594e98b4" (UID: "648ab531-aea8-438e-9a6a-f827594e98b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.626719 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/648ab531-aea8-438e-9a6a-f827594e98b4-kube-api-access-qjj7h" (OuterVolumeSpecName: "kube-api-access-qjj7h") pod "648ab531-aea8-438e-9a6a-f827594e98b4" (UID: "648ab531-aea8-438e-9a6a-f827594e98b4"). 
InnerVolumeSpecName "kube-api-access-qjj7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.628023 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/648ab531-aea8-438e-9a6a-f827594e98b4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "648ab531-aea8-438e-9a6a-f827594e98b4" (UID: "648ab531-aea8-438e-9a6a-f827594e98b4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.720806 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/648ab531-aea8-438e-9a6a-f827594e98b4-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.720836 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjj7h\" (UniqueName: \"kubernetes.io/projected/648ab531-aea8-438e-9a6a-f827594e98b4-kube-api-access-qjj7h\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.720845 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/648ab531-aea8-438e-9a6a-f827594e98b4-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.720855 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/648ab531-aea8-438e-9a6a-f827594e98b4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.733503 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4kzvt"] Nov 27 11:14:28 crc kubenswrapper[4807]: I1127 11:14:28.736796 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4kzvt"] Nov 27 11:14:29 crc 
kubenswrapper[4807]: I1127 11:14:29.151039 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6649f4cd5-qrrrl"] Nov 27 11:14:29 crc kubenswrapper[4807]: E1127 11:14:29.151463 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.151493 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 27 11:14:29 crc kubenswrapper[4807]: E1127 11:14:29.151520 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="648ab531-aea8-438e-9a6a-f827594e98b4" containerName="route-controller-manager" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.151536 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="648ab531-aea8-438e-9a6a-f827594e98b4" containerName="route-controller-manager" Nov 27 11:14:29 crc kubenswrapper[4807]: E1127 11:14:29.151571 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7674389a-8181-488b-bb03-d97eee98df00" containerName="installer" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.151587 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7674389a-8181-488b-bb03-d97eee98df00" containerName="installer" Nov 27 11:14:29 crc kubenswrapper[4807]: E1127 11:14:29.151619 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e38d5152-81eb-46c2-9753-84286838528f" containerName="controller-manager" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.151635 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38d5152-81eb-46c2-9753-84286838528f" containerName="controller-manager" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.151817 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="648ab531-aea8-438e-9a6a-f827594e98b4" containerName="route-controller-manager" Nov 27 11:14:29 crc 
kubenswrapper[4807]: I1127 11:14:29.151848 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.151872 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="e38d5152-81eb-46c2-9753-84286838528f" containerName="controller-manager" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.151907 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7674389a-8181-488b-bb03-d97eee98df00" containerName="installer" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.152636 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.153514 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk"] Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.153899 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.156065 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.156256 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.156374 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.159081 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.159095 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.159706 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.169821 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk"] Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.171723 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.175316 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6649f4cd5-qrrrl"] Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.326883 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcm4k\" (UniqueName: 
\"kubernetes.io/projected/5ffdb516-172b-4d9b-934b-eaff80f75bc0-kube-api-access-tcm4k\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.326956 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-config\") pod \"route-controller-manager-7dfd8c449b-rtlkk\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.327020 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-547d6\" (UniqueName: \"kubernetes.io/projected/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-kube-api-access-547d6\") pod \"route-controller-manager-7dfd8c449b-rtlkk\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.327053 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-client-ca\") pod \"route-controller-manager-7dfd8c449b-rtlkk\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.327089 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-serving-cert\") pod \"route-controller-manager-7dfd8c449b-rtlkk\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " 
pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.327136 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-config\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.327179 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-client-ca\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.327215 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ffdb516-172b-4d9b-934b-eaff80f75bc0-serving-cert\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.327241 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-proxy-ca-bundles\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.412992 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.429701 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-serving-cert\") pod \"route-controller-manager-7dfd8c449b-rtlkk\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.430066 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-config\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.430117 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-client-ca\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.430141 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ffdb516-172b-4d9b-934b-eaff80f75bc0-serving-cert\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.430170 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-proxy-ca-bundles\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.430205 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcm4k\" (UniqueName: \"kubernetes.io/projected/5ffdb516-172b-4d9b-934b-eaff80f75bc0-kube-api-access-tcm4k\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.430235 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-config\") pod \"route-controller-manager-7dfd8c449b-rtlkk\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.430277 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-547d6\" (UniqueName: \"kubernetes.io/projected/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-kube-api-access-547d6\") pod \"route-controller-manager-7dfd8c449b-rtlkk\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.431502 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-config\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: 
I1127 11:14:29.431542 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-client-ca\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.431634 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-proxy-ca-bundles\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.432933 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-client-ca\") pod \"route-controller-manager-7dfd8c449b-rtlkk\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.430304 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-client-ca\") pod \"route-controller-manager-7dfd8c449b-rtlkk\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.435955 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-config\") pod \"route-controller-manager-7dfd8c449b-rtlkk\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " 
pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.439063 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-serving-cert\") pod \"route-controller-manager-7dfd8c449b-rtlkk\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.439813 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ffdb516-172b-4d9b-934b-eaff80f75bc0-serving-cert\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.455730 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcm4k\" (UniqueName: \"kubernetes.io/projected/5ffdb516-172b-4d9b-934b-eaff80f75bc0-kube-api-access-tcm4k\") pod \"controller-manager-6649f4cd5-qrrrl\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.456466 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z"] Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.457785 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-547d6\" (UniqueName: \"kubernetes.io/projected/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-kube-api-access-547d6\") pod \"route-controller-manager-7dfd8c449b-rtlkk\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 
11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.461849 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-sxp7z"] Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.474372 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.522760 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.547922 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="648ab531-aea8-438e-9a6a-f827594e98b4" path="/var/lib/kubelet/pods/648ab531-aea8-438e-9a6a-f827594e98b4/volumes" Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.549913 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e38d5152-81eb-46c2-9753-84286838528f" path="/var/lib/kubelet/pods/e38d5152-81eb-46c2-9753-84286838528f/volumes" Nov 27 11:14:29 crc kubenswrapper[4807]: W1127 11:14:29.733168 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ffdb516_172b_4d9b_934b_eaff80f75bc0.slice/crio-765a6b9e22bed6b71b589ff78a0c5cb4a063331daf104e3ce18dfc9b48b893a3 WatchSource:0}: Error finding container 765a6b9e22bed6b71b589ff78a0c5cb4a063331daf104e3ce18dfc9b48b893a3: Status 404 returned error can't find the container with id 765a6b9e22bed6b71b589ff78a0c5cb4a063331daf104e3ce18dfc9b48b893a3 Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.733394 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6649f4cd5-qrrrl"] Nov 27 11:14:29 crc kubenswrapper[4807]: I1127 11:14:29.786875 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk"] Nov 27 11:14:30 crc kubenswrapper[4807]: I1127 11:14:30.420459 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" event={"ID":"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a","Type":"ContainerStarted","Data":"69443e5a74c4fcba315a7a380c26e5ab32c6c749450d77a862542e731ba916e2"} Nov 27 11:14:30 crc kubenswrapper[4807]: I1127 11:14:30.420534 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" event={"ID":"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a","Type":"ContainerStarted","Data":"a11a1e36cef995733b73befe3c37955c7e5610ea67d798da403f0e591bca40d4"} Nov 27 11:14:30 crc kubenswrapper[4807]: I1127 11:14:30.421882 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" event={"ID":"5ffdb516-172b-4d9b-934b-eaff80f75bc0","Type":"ContainerStarted","Data":"61eda4809ce6e950e8c0ca27ffcc4d9b4294758c4715049e86528339296d6893"} Nov 27 11:14:30 crc kubenswrapper[4807]: I1127 11:14:30.421927 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" event={"ID":"5ffdb516-172b-4d9b-934b-eaff80f75bc0","Type":"ContainerStarted","Data":"765a6b9e22bed6b71b589ff78a0c5cb4a063331daf104e3ce18dfc9b48b893a3"} Nov 27 11:14:30 crc kubenswrapper[4807]: I1127 11:14:30.422112 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:30 crc kubenswrapper[4807]: I1127 11:14:30.427377 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:30 crc kubenswrapper[4807]: I1127 11:14:30.437535 4807 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" podStartSLOduration=3.437513281 podStartE2EDuration="3.437513281s" podCreationTimestamp="2025-11-27 11:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:14:30.435521104 +0000 UTC m=+311.535019302" watchObservedRunningTime="2025-11-27 11:14:30.437513281 +0000 UTC m=+311.537011509" Nov 27 11:14:30 crc kubenswrapper[4807]: I1127 11:14:30.452824 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" podStartSLOduration=3.4527855880000002 podStartE2EDuration="3.452785588s" podCreationTimestamp="2025-11-27 11:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:14:30.45179744 +0000 UTC m=+311.551295678" watchObservedRunningTime="2025-11-27 11:14:30.452785588 +0000 UTC m=+311.552283826" Nov 27 11:14:31 crc kubenswrapper[4807]: I1127 11:14:31.429029 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:31 crc kubenswrapper[4807]: I1127 11:14:31.436760 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:31 crc kubenswrapper[4807]: I1127 11:14:31.990876 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6649f4cd5-qrrrl"] Nov 27 11:14:32 crc kubenswrapper[4807]: I1127 11:14:32.044489 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk"] Nov 27 11:14:33 crc kubenswrapper[4807]: I1127 11:14:33.439897 4807 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" podUID="403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a" containerName="route-controller-manager" containerID="cri-o://69443e5a74c4fcba315a7a380c26e5ab32c6c749450d77a862542e731ba916e2" gracePeriod=30 Nov 27 11:14:33 crc kubenswrapper[4807]: I1127 11:14:33.440036 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" podUID="5ffdb516-172b-4d9b-934b-eaff80f75bc0" containerName="controller-manager" containerID="cri-o://61eda4809ce6e950e8c0ca27ffcc4d9b4294758c4715049e86528339296d6893" gracePeriod=30 Nov 27 11:14:33 crc kubenswrapper[4807]: I1127 11:14:33.854784 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:33 crc kubenswrapper[4807]: I1127 11:14:33.899310 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:33 crc kubenswrapper[4807]: I1127 11:14:33.939974 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk"] Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.004531 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-config\") pod \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.004588 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-config\") pod \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.004611 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-proxy-ca-bundles\") pod \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.004650 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcm4k\" (UniqueName: \"kubernetes.io/projected/5ffdb516-172b-4d9b-934b-eaff80f75bc0-kube-api-access-tcm4k\") pod \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.004682 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-547d6\" (UniqueName: \"kubernetes.io/projected/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-kube-api-access-547d6\") pod \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\" 
(UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.004738 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-serving-cert\") pod \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.004765 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ffdb516-172b-4d9b-934b-eaff80f75bc0-serving-cert\") pod \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.004789 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-client-ca\") pod \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\" (UID: \"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a\") " Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.004819 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-client-ca\") pod \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\" (UID: \"5ffdb516-172b-4d9b-934b-eaff80f75bc0\") " Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.005229 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-config" (OuterVolumeSpecName: "config") pod "5ffdb516-172b-4d9b-934b-eaff80f75bc0" (UID: "5ffdb516-172b-4d9b-934b-eaff80f75bc0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.005338 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-client-ca" (OuterVolumeSpecName: "client-ca") pod "5ffdb516-172b-4d9b-934b-eaff80f75bc0" (UID: "5ffdb516-172b-4d9b-934b-eaff80f75bc0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.005342 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-config" (OuterVolumeSpecName: "config") pod "403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a" (UID: "403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.005540 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-client-ca" (OuterVolumeSpecName: "client-ca") pod "403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a" (UID: "403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.005904 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5ffdb516-172b-4d9b-934b-eaff80f75bc0" (UID: "5ffdb516-172b-4d9b-934b-eaff80f75bc0"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.010453 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-kube-api-access-547d6" (OuterVolumeSpecName: "kube-api-access-547d6") pod "403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a" (UID: "403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a"). InnerVolumeSpecName "kube-api-access-547d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.012286 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ffdb516-172b-4d9b-934b-eaff80f75bc0-kube-api-access-tcm4k" (OuterVolumeSpecName: "kube-api-access-tcm4k") pod "5ffdb516-172b-4d9b-934b-eaff80f75bc0" (UID: "5ffdb516-172b-4d9b-934b-eaff80f75bc0"). InnerVolumeSpecName "kube-api-access-tcm4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.012390 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ffdb516-172b-4d9b-934b-eaff80f75bc0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5ffdb516-172b-4d9b-934b-eaff80f75bc0" (UID: "5ffdb516-172b-4d9b-934b-eaff80f75bc0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.013680 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a" (UID: "403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.106386 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-547d6\" (UniqueName: \"kubernetes.io/projected/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-kube-api-access-547d6\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.106434 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.106450 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ffdb516-172b-4d9b-934b-eaff80f75bc0-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.106464 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.106478 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.106492 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.106504 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.106515 4807 reconciler_common.go:293] "Volume detached for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ffdb516-172b-4d9b-934b-eaff80f75bc0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.106528 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcm4k\" (UniqueName: \"kubernetes.io/projected/5ffdb516-172b-4d9b-934b-eaff80f75bc0-kube-api-access-tcm4k\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.446832 4807 generic.go:334] "Generic (PLEG): container finished" podID="5ffdb516-172b-4d9b-934b-eaff80f75bc0" containerID="61eda4809ce6e950e8c0ca27ffcc4d9b4294758c4715049e86528339296d6893" exitCode=0 Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.446877 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.446910 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" event={"ID":"5ffdb516-172b-4d9b-934b-eaff80f75bc0","Type":"ContainerDied","Data":"61eda4809ce6e950e8c0ca27ffcc4d9b4294758c4715049e86528339296d6893"} Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.446937 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6649f4cd5-qrrrl" event={"ID":"5ffdb516-172b-4d9b-934b-eaff80f75bc0","Type":"ContainerDied","Data":"765a6b9e22bed6b71b589ff78a0c5cb4a063331daf104e3ce18dfc9b48b893a3"} Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.446955 4807 scope.go:117] "RemoveContainer" containerID="61eda4809ce6e950e8c0ca27ffcc4d9b4294758c4715049e86528339296d6893" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.450349 4807 generic.go:334] "Generic (PLEG): container finished" podID="403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a" 
containerID="69443e5a74c4fcba315a7a380c26e5ab32c6c749450d77a862542e731ba916e2" exitCode=0 Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.450389 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" event={"ID":"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a","Type":"ContainerDied","Data":"69443e5a74c4fcba315a7a380c26e5ab32c6c749450d77a862542e731ba916e2"} Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.450413 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" event={"ID":"403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a","Type":"ContainerDied","Data":"a11a1e36cef995733b73befe3c37955c7e5610ea67d798da403f0e591bca40d4"} Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.450740 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.463094 4807 scope.go:117] "RemoveContainer" containerID="61eda4809ce6e950e8c0ca27ffcc4d9b4294758c4715049e86528339296d6893" Nov 27 11:14:34 crc kubenswrapper[4807]: E1127 11:14:34.464193 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61eda4809ce6e950e8c0ca27ffcc4d9b4294758c4715049e86528339296d6893\": container with ID starting with 61eda4809ce6e950e8c0ca27ffcc4d9b4294758c4715049e86528339296d6893 not found: ID does not exist" containerID="61eda4809ce6e950e8c0ca27ffcc4d9b4294758c4715049e86528339296d6893" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.464261 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61eda4809ce6e950e8c0ca27ffcc4d9b4294758c4715049e86528339296d6893"} err="failed to get container status \"61eda4809ce6e950e8c0ca27ffcc4d9b4294758c4715049e86528339296d6893\": 
rpc error: code = NotFound desc = could not find container \"61eda4809ce6e950e8c0ca27ffcc4d9b4294758c4715049e86528339296d6893\": container with ID starting with 61eda4809ce6e950e8c0ca27ffcc4d9b4294758c4715049e86528339296d6893 not found: ID does not exist" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.464347 4807 scope.go:117] "RemoveContainer" containerID="69443e5a74c4fcba315a7a380c26e5ab32c6c749450d77a862542e731ba916e2" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.485348 4807 scope.go:117] "RemoveContainer" containerID="69443e5a74c4fcba315a7a380c26e5ab32c6c749450d77a862542e731ba916e2" Nov 27 11:14:34 crc kubenswrapper[4807]: E1127 11:14:34.485844 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69443e5a74c4fcba315a7a380c26e5ab32c6c749450d77a862542e731ba916e2\": container with ID starting with 69443e5a74c4fcba315a7a380c26e5ab32c6c749450d77a862542e731ba916e2 not found: ID does not exist" containerID="69443e5a74c4fcba315a7a380c26e5ab32c6c749450d77a862542e731ba916e2" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.485907 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69443e5a74c4fcba315a7a380c26e5ab32c6c749450d77a862542e731ba916e2"} err="failed to get container status \"69443e5a74c4fcba315a7a380c26e5ab32c6c749450d77a862542e731ba916e2\": rpc error: code = NotFound desc = could not find container \"69443e5a74c4fcba315a7a380c26e5ab32c6c749450d77a862542e731ba916e2\": container with ID starting with 69443e5a74c4fcba315a7a380c26e5ab32c6c749450d77a862542e731ba916e2 not found: ID does not exist" Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.486779 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6649f4cd5-qrrrl"] Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.491287 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-6649f4cd5-qrrrl"] Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.495901 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk"] Nov 27 11:14:34 crc kubenswrapper[4807]: I1127 11:14:34.501353 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dfd8c449b-rtlkk"] Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.158275 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9"] Nov 27 11:14:35 crc kubenswrapper[4807]: E1127 11:14:35.159066 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ffdb516-172b-4d9b-934b-eaff80f75bc0" containerName="controller-manager" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.159095 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ffdb516-172b-4d9b-934b-eaff80f75bc0" containerName="controller-manager" Nov 27 11:14:35 crc kubenswrapper[4807]: E1127 11:14:35.159127 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a" containerName="route-controller-manager" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.159139 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a" containerName="route-controller-manager" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.159367 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ffdb516-172b-4d9b-934b-eaff80f75bc0" containerName="controller-manager" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.159401 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a" containerName="route-controller-manager" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.159968 4807 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.160980 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-prrd9"] Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.161622 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.162540 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.162963 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.163611 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.165330 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.165637 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.165938 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.166497 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.166628 4807 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.166853 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.166952 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.166998 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.167104 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.178984 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-prrd9"] Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.180433 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.186915 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9"] Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.220148 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-serving-cert\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.220198 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-client-ca\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.220239 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-proxy-ca-bundles\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.220316 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4b26864-649d-4197-9881-44992c80f0c4-config\") pod \"route-controller-manager-75664bd6d9-625d9\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.220368 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4b26864-649d-4197-9881-44992c80f0c4-client-ca\") pod \"route-controller-manager-75664bd6d9-625d9\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.220404 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-config\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " 
pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.220580 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4b26864-649d-4197-9881-44992c80f0c4-serving-cert\") pod \"route-controller-manager-75664bd6d9-625d9\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.220941 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj6mp\" (UniqueName: \"kubernetes.io/projected/e4b26864-649d-4197-9881-44992c80f0c4-kube-api-access-pj6mp\") pod \"route-controller-manager-75664bd6d9-625d9\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.221003 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8sm4\" (UniqueName: \"kubernetes.io/projected/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-kube-api-access-j8sm4\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.321707 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-config\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.321794 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4b26864-649d-4197-9881-44992c80f0c4-serving-cert\") pod \"route-controller-manager-75664bd6d9-625d9\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.321853 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj6mp\" (UniqueName: \"kubernetes.io/projected/e4b26864-649d-4197-9881-44992c80f0c4-kube-api-access-pj6mp\") pod \"route-controller-manager-75664bd6d9-625d9\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.321903 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8sm4\" (UniqueName: \"kubernetes.io/projected/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-kube-api-access-j8sm4\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.321972 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-serving-cert\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.322006 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-client-ca\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " 
pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.322041 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-proxy-ca-bundles\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.322092 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4b26864-649d-4197-9881-44992c80f0c4-config\") pod \"route-controller-manager-75664bd6d9-625d9\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.322128 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4b26864-649d-4197-9881-44992c80f0c4-client-ca\") pod \"route-controller-manager-75664bd6d9-625d9\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.323670 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4b26864-649d-4197-9881-44992c80f0c4-client-ca\") pod \"route-controller-manager-75664bd6d9-625d9\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.323997 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e4b26864-649d-4197-9881-44992c80f0c4-config\") pod \"route-controller-manager-75664bd6d9-625d9\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.324197 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-config\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.324791 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-client-ca\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.325300 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-proxy-ca-bundles\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.328682 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4b26864-649d-4197-9881-44992c80f0c4-serving-cert\") pod \"route-controller-manager-75664bd6d9-625d9\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.335724 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-serving-cert\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.339500 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj6mp\" (UniqueName: \"kubernetes.io/projected/e4b26864-649d-4197-9881-44992c80f0c4-kube-api-access-pj6mp\") pod \"route-controller-manager-75664bd6d9-625d9\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.342806 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8sm4\" (UniqueName: \"kubernetes.io/projected/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-kube-api-access-j8sm4\") pod \"controller-manager-658fd5994d-prrd9\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.483844 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.495224 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.539434 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a" path="/var/lib/kubelet/pods/403aabfa-1fd7-49b2-b4a3-abcfa4c7cd3a/volumes" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.540296 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ffdb516-172b-4d9b-934b-eaff80f75bc0" path="/var/lib/kubelet/pods/5ffdb516-172b-4d9b-934b-eaff80f75bc0/volumes" Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.735565 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-prrd9"] Nov 27 11:14:35 crc kubenswrapper[4807]: W1127 11:14:35.739694 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45883014_d0c7_4ce0_ad6b_2ea83ab167ca.slice/crio-f22f46fe820d9da4fd45da2f62753988f9ec66afc9dc41cba2c1a213fc293a16 WatchSource:0}: Error finding container f22f46fe820d9da4fd45da2f62753988f9ec66afc9dc41cba2c1a213fc293a16: Status 404 returned error can't find the container with id f22f46fe820d9da4fd45da2f62753988f9ec66afc9dc41cba2c1a213fc293a16 Nov 27 11:14:35 crc kubenswrapper[4807]: I1127 11:14:35.875398 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9"] Nov 27 11:14:35 crc kubenswrapper[4807]: W1127 11:14:35.891804 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4b26864_649d_4197_9881_44992c80f0c4.slice/crio-57b1bb3fa903fc945abf9e31344078e5696fefdcb89bb60170c1a708b5bb47ef WatchSource:0}: Error finding container 57b1bb3fa903fc945abf9e31344078e5696fefdcb89bb60170c1a708b5bb47ef: Status 404 returned error can't find the container with 
id 57b1bb3fa903fc945abf9e31344078e5696fefdcb89bb60170c1a708b5bb47ef Nov 27 11:14:36 crc kubenswrapper[4807]: I1127 11:14:36.464507 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" event={"ID":"45883014-d0c7-4ce0-ad6b-2ea83ab167ca","Type":"ContainerStarted","Data":"b0030f4ed48a98ca8f59115d4b872121819ccd5290871c57d9ea1bc95def4887"} Nov 27 11:14:36 crc kubenswrapper[4807]: I1127 11:14:36.464814 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" event={"ID":"45883014-d0c7-4ce0-ad6b-2ea83ab167ca","Type":"ContainerStarted","Data":"f22f46fe820d9da4fd45da2f62753988f9ec66afc9dc41cba2c1a213fc293a16"} Nov 27 11:14:36 crc kubenswrapper[4807]: I1127 11:14:36.465277 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:36 crc kubenswrapper[4807]: I1127 11:14:36.466672 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" event={"ID":"e4b26864-649d-4197-9881-44992c80f0c4","Type":"ContainerStarted","Data":"f066814f5effab106ce374179eae9aca4e297bf065e22eadf1f27087290edbc2"} Nov 27 11:14:36 crc kubenswrapper[4807]: I1127 11:14:36.466709 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" event={"ID":"e4b26864-649d-4197-9881-44992c80f0c4","Type":"ContainerStarted","Data":"57b1bb3fa903fc945abf9e31344078e5696fefdcb89bb60170c1a708b5bb47ef"} Nov 27 11:14:36 crc kubenswrapper[4807]: I1127 11:14:36.466879 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:36 crc kubenswrapper[4807]: I1127 11:14:36.469844 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:14:36 crc kubenswrapper[4807]: I1127 11:14:36.470774 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:36 crc kubenswrapper[4807]: I1127 11:14:36.488351 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" podStartSLOduration=3.488332047 podStartE2EDuration="3.488332047s" podCreationTimestamp="2025-11-27 11:14:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:14:36.486440292 +0000 UTC m=+317.585938490" watchObservedRunningTime="2025-11-27 11:14:36.488332047 +0000 UTC m=+317.587830245" Nov 27 11:14:36 crc kubenswrapper[4807]: I1127 11:14:36.522340 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" podStartSLOduration=3.52231964 podStartE2EDuration="3.52231964s" podCreationTimestamp="2025-11-27 11:14:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:14:36.521721903 +0000 UTC m=+317.621220101" watchObservedRunningTime="2025-11-27 11:14:36.52231964 +0000 UTC m=+317.621817848" Nov 27 11:14:47 crc kubenswrapper[4807]: I1127 11:14:47.338021 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9"] Nov 27 11:14:47 crc kubenswrapper[4807]: I1127 11:14:47.338777 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" podUID="e4b26864-649d-4197-9881-44992c80f0c4" 
containerName="route-controller-manager" containerID="cri-o://f066814f5effab106ce374179eae9aca4e297bf065e22eadf1f27087290edbc2" gracePeriod=30 Nov 27 11:14:47 crc kubenswrapper[4807]: I1127 11:14:47.525544 4807 generic.go:334] "Generic (PLEG): container finished" podID="e4b26864-649d-4197-9881-44992c80f0c4" containerID="f066814f5effab106ce374179eae9aca4e297bf065e22eadf1f27087290edbc2" exitCode=0 Nov 27 11:14:47 crc kubenswrapper[4807]: I1127 11:14:47.525594 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" event={"ID":"e4b26864-649d-4197-9881-44992c80f0c4","Type":"ContainerDied","Data":"f066814f5effab106ce374179eae9aca4e297bf065e22eadf1f27087290edbc2"} Nov 27 11:14:47 crc kubenswrapper[4807]: I1127 11:14:47.828771 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:47 crc kubenswrapper[4807]: I1127 11:14:47.969398 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj6mp\" (UniqueName: \"kubernetes.io/projected/e4b26864-649d-4197-9881-44992c80f0c4-kube-api-access-pj6mp\") pod \"e4b26864-649d-4197-9881-44992c80f0c4\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " Nov 27 11:14:47 crc kubenswrapper[4807]: I1127 11:14:47.969471 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4b26864-649d-4197-9881-44992c80f0c4-config\") pod \"e4b26864-649d-4197-9881-44992c80f0c4\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " Nov 27 11:14:47 crc kubenswrapper[4807]: I1127 11:14:47.969549 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4b26864-649d-4197-9881-44992c80f0c4-client-ca\") pod \"e4b26864-649d-4197-9881-44992c80f0c4\" (UID: 
\"e4b26864-649d-4197-9881-44992c80f0c4\") " Nov 27 11:14:47 crc kubenswrapper[4807]: I1127 11:14:47.969627 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4b26864-649d-4197-9881-44992c80f0c4-serving-cert\") pod \"e4b26864-649d-4197-9881-44992c80f0c4\" (UID: \"e4b26864-649d-4197-9881-44992c80f0c4\") " Nov 27 11:14:47 crc kubenswrapper[4807]: I1127 11:14:47.970910 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b26864-649d-4197-9881-44992c80f0c4-client-ca" (OuterVolumeSpecName: "client-ca") pod "e4b26864-649d-4197-9881-44992c80f0c4" (UID: "e4b26864-649d-4197-9881-44992c80f0c4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:47 crc kubenswrapper[4807]: I1127 11:14:47.970918 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b26864-649d-4197-9881-44992c80f0c4-config" (OuterVolumeSpecName: "config") pod "e4b26864-649d-4197-9881-44992c80f0c4" (UID: "e4b26864-649d-4197-9881-44992c80f0c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:47 crc kubenswrapper[4807]: I1127 11:14:47.975230 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4b26864-649d-4197-9881-44992c80f0c4-kube-api-access-pj6mp" (OuterVolumeSpecName: "kube-api-access-pj6mp") pod "e4b26864-649d-4197-9881-44992c80f0c4" (UID: "e4b26864-649d-4197-9881-44992c80f0c4"). InnerVolumeSpecName "kube-api-access-pj6mp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:14:47 crc kubenswrapper[4807]: I1127 11:14:47.976472 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4b26864-649d-4197-9881-44992c80f0c4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e4b26864-649d-4197-9881-44992c80f0c4" (UID: "e4b26864-649d-4197-9881-44992c80f0c4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:14:48 crc kubenswrapper[4807]: I1127 11:14:48.070651 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4b26864-649d-4197-9881-44992c80f0c4-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:48 crc kubenswrapper[4807]: I1127 11:14:48.070683 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4b26864-649d-4197-9881-44992c80f0c4-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:48 crc kubenswrapper[4807]: I1127 11:14:48.070697 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj6mp\" (UniqueName: \"kubernetes.io/projected/e4b26864-649d-4197-9881-44992c80f0c4-kube-api-access-pj6mp\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:48 crc kubenswrapper[4807]: I1127 11:14:48.070710 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4b26864-649d-4197-9881-44992c80f0c4-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:48 crc kubenswrapper[4807]: I1127 11:14:48.533516 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" event={"ID":"e4b26864-649d-4197-9881-44992c80f0c4","Type":"ContainerDied","Data":"57b1bb3fa903fc945abf9e31344078e5696fefdcb89bb60170c1a708b5bb47ef"} Nov 27 11:14:48 crc kubenswrapper[4807]: I1127 11:14:48.533593 4807 scope.go:117] "RemoveContainer" 
containerID="f066814f5effab106ce374179eae9aca4e297bf065e22eadf1f27087290edbc2" Nov 27 11:14:48 crc kubenswrapper[4807]: I1127 11:14:48.534963 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9" Nov 27 11:14:48 crc kubenswrapper[4807]: I1127 11:14:48.580570 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9"] Nov 27 11:14:48 crc kubenswrapper[4807]: I1127 11:14:48.583752 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75664bd6d9-625d9"] Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.167529 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5"] Nov 27 11:14:49 crc kubenswrapper[4807]: E1127 11:14:49.167832 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4b26864-649d-4197-9881-44992c80f0c4" containerName="route-controller-manager" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.167854 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b26864-649d-4197-9881-44992c80f0c4" containerName="route-controller-manager" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.168083 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4b26864-649d-4197-9881-44992c80f0c4" containerName="route-controller-manager" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.168717 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.170468 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.171790 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.171935 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.172120 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.173124 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.173387 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.209029 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c56d176-9b33-4c04-ad4e-b476b5681d2e-client-ca\") pod \"route-controller-manager-684cd4d5c4-zkxn5\" (UID: \"5c56d176-9b33-4c04-ad4e-b476b5681d2e\") " pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.209075 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g7lk\" (UniqueName: \"kubernetes.io/projected/5c56d176-9b33-4c04-ad4e-b476b5681d2e-kube-api-access-2g7lk\") pod 
\"route-controller-manager-684cd4d5c4-zkxn5\" (UID: \"5c56d176-9b33-4c04-ad4e-b476b5681d2e\") " pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.209110 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c56d176-9b33-4c04-ad4e-b476b5681d2e-config\") pod \"route-controller-manager-684cd4d5c4-zkxn5\" (UID: \"5c56d176-9b33-4c04-ad4e-b476b5681d2e\") " pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.209168 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c56d176-9b33-4c04-ad4e-b476b5681d2e-serving-cert\") pod \"route-controller-manager-684cd4d5c4-zkxn5\" (UID: \"5c56d176-9b33-4c04-ad4e-b476b5681d2e\") " pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.212921 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5"] Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.309814 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c56d176-9b33-4c04-ad4e-b476b5681d2e-serving-cert\") pod \"route-controller-manager-684cd4d5c4-zkxn5\" (UID: \"5c56d176-9b33-4c04-ad4e-b476b5681d2e\") " pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.310098 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c56d176-9b33-4c04-ad4e-b476b5681d2e-client-ca\") pod 
\"route-controller-manager-684cd4d5c4-zkxn5\" (UID: \"5c56d176-9b33-4c04-ad4e-b476b5681d2e\") " pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.310139 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g7lk\" (UniqueName: \"kubernetes.io/projected/5c56d176-9b33-4c04-ad4e-b476b5681d2e-kube-api-access-2g7lk\") pod \"route-controller-manager-684cd4d5c4-zkxn5\" (UID: \"5c56d176-9b33-4c04-ad4e-b476b5681d2e\") " pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.310173 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c56d176-9b33-4c04-ad4e-b476b5681d2e-config\") pod \"route-controller-manager-684cd4d5c4-zkxn5\" (UID: \"5c56d176-9b33-4c04-ad4e-b476b5681d2e\") " pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.311341 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5c56d176-9b33-4c04-ad4e-b476b5681d2e-client-ca\") pod \"route-controller-manager-684cd4d5c4-zkxn5\" (UID: \"5c56d176-9b33-4c04-ad4e-b476b5681d2e\") " pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.311447 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c56d176-9b33-4c04-ad4e-b476b5681d2e-config\") pod \"route-controller-manager-684cd4d5c4-zkxn5\" (UID: \"5c56d176-9b33-4c04-ad4e-b476b5681d2e\") " pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.315653 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c56d176-9b33-4c04-ad4e-b476b5681d2e-serving-cert\") pod \"route-controller-manager-684cd4d5c4-zkxn5\" (UID: \"5c56d176-9b33-4c04-ad4e-b476b5681d2e\") " pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.340626 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g7lk\" (UniqueName: \"kubernetes.io/projected/5c56d176-9b33-4c04-ad4e-b476b5681d2e-kube-api-access-2g7lk\") pod \"route-controller-manager-684cd4d5c4-zkxn5\" (UID: \"5c56d176-9b33-4c04-ad4e-b476b5681d2e\") " pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.519364 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.542233 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4b26864-649d-4197-9881-44992c80f0c4" path="/var/lib/kubelet/pods/e4b26864-649d-4197-9881-44992c80f0c4/volumes" Nov 27 11:14:49 crc kubenswrapper[4807]: I1127 11:14:49.928667 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5"] Nov 27 11:14:50 crc kubenswrapper[4807]: I1127 11:14:50.551790 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" event={"ID":"5c56d176-9b33-4c04-ad4e-b476b5681d2e","Type":"ContainerStarted","Data":"7afd37cc732d5a27ab36fe6851d621e7d039501b76d836be179fa9143c9221ae"} Nov 27 11:14:50 crc kubenswrapper[4807]: I1127 11:14:50.552061 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" event={"ID":"5c56d176-9b33-4c04-ad4e-b476b5681d2e","Type":"ContainerStarted","Data":"d88067930dc620dcd8cd290b61ad123583d89550efd2610f60b146ede2b85f02"} Nov 27 11:14:50 crc kubenswrapper[4807]: I1127 11:14:50.552079 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:50 crc kubenswrapper[4807]: I1127 11:14:50.559918 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" Nov 27 11:14:50 crc kubenswrapper[4807]: I1127 11:14:50.577087 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-684cd4d5c4-zkxn5" podStartSLOduration=3.577070578 podStartE2EDuration="3.577070578s" podCreationTimestamp="2025-11-27 11:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:14:50.574335421 +0000 UTC m=+331.673833629" watchObservedRunningTime="2025-11-27 11:14:50.577070578 +0000 UTC m=+331.676568776" Nov 27 11:14:58 crc kubenswrapper[4807]: I1127 11:14:58.972830 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" podUID="0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" containerName="oauth-openshift" containerID="cri-o://d41d8759ec666e46c9dccf04fa78c6be89bfe268b1780229d9624921841f99ea" gracePeriod=15 Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.563438 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.598657 4807 generic.go:334] "Generic (PLEG): container finished" podID="0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" containerID="d41d8759ec666e46c9dccf04fa78c6be89bfe268b1780229d9624921841f99ea" exitCode=0 Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.598707 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" event={"ID":"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f","Type":"ContainerDied","Data":"d41d8759ec666e46c9dccf04fa78c6be89bfe268b1780229d9624921841f99ea"} Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.598738 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" event={"ID":"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f","Type":"ContainerDied","Data":"207fae96195cf5d8aad19742abd0920d7438967050e106ea398030953d46dea1"} Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.598758 4807 scope.go:117] "RemoveContainer" containerID="d41d8759ec666e46c9dccf04fa78c6be89bfe268b1780229d9624921841f99ea" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.598780 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.607637 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-57bd9f8449-x6wvd"] Nov 27 11:14:59 crc kubenswrapper[4807]: E1127 11:14:59.607908 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" containerName="oauth-openshift" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.607930 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" containerName="oauth-openshift" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.608080 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" containerName="oauth-openshift" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.608568 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.623069 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57bd9f8449-x6wvd"] Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.632831 4807 scope.go:117] "RemoveContainer" containerID="d41d8759ec666e46c9dccf04fa78c6be89bfe268b1780229d9624921841f99ea" Nov 27 11:14:59 crc kubenswrapper[4807]: E1127 11:14:59.634950 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d41d8759ec666e46c9dccf04fa78c6be89bfe268b1780229d9624921841f99ea\": container with ID starting with d41d8759ec666e46c9dccf04fa78c6be89bfe268b1780229d9624921841f99ea not found: ID does not exist" containerID="d41d8759ec666e46c9dccf04fa78c6be89bfe268b1780229d9624921841f99ea" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.634983 4807 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d41d8759ec666e46c9dccf04fa78c6be89bfe268b1780229d9624921841f99ea"} err="failed to get container status \"d41d8759ec666e46c9dccf04fa78c6be89bfe268b1780229d9624921841f99ea\": rpc error: code = NotFound desc = could not find container \"d41d8759ec666e46c9dccf04fa78c6be89bfe268b1780229d9624921841f99ea\": container with ID starting with d41d8759ec666e46c9dccf04fa78c6be89bfe268b1780229d9624921841f99ea not found: ID does not exist" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639324 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-provider-selection\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639368 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-router-certs\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639398 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-audit-policies\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639429 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-error\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: 
\"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639468 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-serving-cert\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639488 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-trusted-ca-bundle\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639510 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk7bf\" (UniqueName: \"kubernetes.io/projected/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-kube-api-access-lk7bf\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639538 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-ocp-branding-template\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639560 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-session\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc 
kubenswrapper[4807]: I1127 11:14:59.639584 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-login\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639607 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-cliconfig\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639631 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-audit-dir\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639654 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-service-ca\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639686 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-idp-0-file-data\") pod \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\" (UID: \"0d6f0746-0c61-45ef-94d0-f3d4bb789f1f\") " Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639773 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hgrm\" (UniqueName: \"kubernetes.io/projected/3b88ef47-c80e-4490-b4fc-02f3b3d27957-kube-api-access-6hgrm\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639805 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639828 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639865 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-user-template-error\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639886 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639905 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639936 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3b88ef47-c80e-4490-b4fc-02f3b3d27957-audit-policies\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639958 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-session\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.639978 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: 
\"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.640001 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.640025 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.640063 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b88ef47-c80e-4490-b4fc-02f3b3d27957-audit-dir\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.640090 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-user-template-login\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 
11:14:59.640112 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.641021 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.641883 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.642372 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.644224 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.644522 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.646373 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-kube-api-access-lk7bf" (OuterVolumeSpecName: "kube-api-access-lk7bf") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "kube-api-access-lk7bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.647382 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.647701 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.647842 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.648665 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.648821 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.649781 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.650948 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.656033 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" (UID: "0d6f0746-0c61-45ef-94d0-f3d4bb789f1f"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.741456 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b88ef47-c80e-4490-b4fc-02f3b3d27957-audit-dir\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.741514 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-user-template-login\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.741540 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.741564 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hgrm\" (UniqueName: \"kubernetes.io/projected/3b88ef47-c80e-4490-b4fc-02f3b3d27957-kube-api-access-6hgrm\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.741592 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.741616 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-service-ca\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.741659 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-user-template-error\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.741684 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.741691 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3b88ef47-c80e-4490-b4fc-02f3b3d27957-audit-dir\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc 
kubenswrapper[4807]: I1127 11:14:59.741709 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.741876 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3b88ef47-c80e-4490-b4fc-02f3b3d27957-audit-policies\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742146 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-session\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742189 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742229 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742304 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742452 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742480 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742499 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742521 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk7bf\" (UniqueName: \"kubernetes.io/projected/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-kube-api-access-lk7bf\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742541 4807 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742562 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742580 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742597 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742616 4807 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742636 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742654 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 27 
11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742674 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742692 4807 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742711 4807 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.742763 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.743045 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3b88ef47-c80e-4490-b4fc-02f3b3d27957-audit-policies\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.743419 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-service-ca\") 
pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.744008 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.745208 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-session\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.745335 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.745447 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.745810 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.746188 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-user-template-error\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.746462 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-system-router-certs\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.746578 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.749014 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3b88ef47-c80e-4490-b4fc-02f3b3d27957-v4-0-config-user-template-login\") pod 
\"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.762053 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hgrm\" (UniqueName: \"kubernetes.io/projected/3b88ef47-c80e-4490-b4fc-02f3b3d27957-kube-api-access-6hgrm\") pod \"oauth-openshift-57bd9f8449-x6wvd\" (UID: \"3b88ef47-c80e-4490-b4fc-02f3b3d27957\") " pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.929724 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk"] Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.933933 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk"] Nov 27 11:14:59 crc kubenswrapper[4807]: I1127 11:14:59.940842 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.169644 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch"] Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.170207 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.173563 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.173611 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.188786 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch"] Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.248755 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbnbx\" (UniqueName: \"kubernetes.io/projected/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-kube-api-access-jbnbx\") pod \"collect-profiles-29404035-l9jch\" (UID: \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.248896 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-secret-volume\") pod \"collect-profiles-29404035-l9jch\" (UID: \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.248934 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-config-volume\") pod \"collect-profiles-29404035-l9jch\" (UID: \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.336663 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57bd9f8449-x6wvd"] Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.352207 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-secret-volume\") pod \"collect-profiles-29404035-l9jch\" (UID: \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.352274 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-config-volume\") pod \"collect-profiles-29404035-l9jch\" (UID: \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.352314 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbnbx\" (UniqueName: \"kubernetes.io/projected/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-kube-api-access-jbnbx\") pod \"collect-profiles-29404035-l9jch\" (UID: \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.353057 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-config-volume\") pod \"collect-profiles-29404035-l9jch\" (UID: \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 
11:15:00.356915 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-secret-volume\") pod \"collect-profiles-29404035-l9jch\" (UID: \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.371678 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbnbx\" (UniqueName: \"kubernetes.io/projected/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-kube-api-access-jbnbx\") pod \"collect-profiles-29404035-l9jch\" (UID: \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.419531 4807 patch_prober.go:28] interesting pod/oauth-openshift-6c8d5d4f46-t6sqk container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.419599 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6c8d5d4f46-t6sqk" podUID="0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.489045 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.620397 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" event={"ID":"3b88ef47-c80e-4490-b4fc-02f3b3d27957","Type":"ContainerStarted","Data":"e6d681638d073facf14071aa677cf83b806d88f5a006cc804b1c6ee15d15d78d"} Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.620740 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" event={"ID":"3b88ef47-c80e-4490-b4fc-02f3b3d27957","Type":"ContainerStarted","Data":"cdef3a865f929cd8d315958bbe4880751cf8aa756edb86d358a136436ae39b10"} Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.620763 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.623851 4807 patch_prober.go:28] interesting pod/oauth-openshift-57bd9f8449-x6wvd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": dial tcp 10.217.0.64:6443: connect: connection refused" start-of-body= Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.623898 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" podUID="3b88ef47-c80e-4490-b4fc-02f3b3d27957" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": dial tcp 10.217.0.64:6443: connect: connection refused" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.639547 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" podStartSLOduration=27.639530361 podStartE2EDuration="27.639530361s" 
podCreationTimestamp="2025-11-27 11:14:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:15:00.63702964 +0000 UTC m=+341.736527838" watchObservedRunningTime="2025-11-27 11:15:00.639530361 +0000 UTC m=+341.739028559" Nov 27 11:15:00 crc kubenswrapper[4807]: I1127 11:15:00.877163 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch"] Nov 27 11:15:00 crc kubenswrapper[4807]: W1127 11:15:00.881362 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81fc3365_a2d9_4adb_a884_97c8c4b6d3f2.slice/crio-0fee1be0fd15b227469b068ec0bccd5df19e4d62b62790efcc2885cc33cd26ab WatchSource:0}: Error finding container 0fee1be0fd15b227469b068ec0bccd5df19e4d62b62790efcc2885cc33cd26ab: Status 404 returned error can't find the container with id 0fee1be0fd15b227469b068ec0bccd5df19e4d62b62790efcc2885cc33cd26ab Nov 27 11:15:01 crc kubenswrapper[4807]: I1127 11:15:01.539657 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d6f0746-0c61-45ef-94d0-f3d4bb789f1f" path="/var/lib/kubelet/pods/0d6f0746-0c61-45ef-94d0-f3d4bb789f1f/volumes" Nov 27 11:15:01 crc kubenswrapper[4807]: I1127 11:15:01.626853 4807 generic.go:334] "Generic (PLEG): container finished" podID="81fc3365-a2d9-4adb-a884-97c8c4b6d3f2" containerID="5f57af6322a1d584b36a0620f6ed4ebfe5b176473ea50ee276e63b051d449ebc" exitCode=0 Nov 27 11:15:01 crc kubenswrapper[4807]: I1127 11:15:01.626946 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" event={"ID":"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2","Type":"ContainerDied","Data":"5f57af6322a1d584b36a0620f6ed4ebfe5b176473ea50ee276e63b051d449ebc"} Nov 27 11:15:01 crc kubenswrapper[4807]: I1127 11:15:01.627199 4807 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" event={"ID":"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2","Type":"ContainerStarted","Data":"0fee1be0fd15b227469b068ec0bccd5df19e4d62b62790efcc2885cc33cd26ab"} Nov 27 11:15:01 crc kubenswrapper[4807]: I1127 11:15:01.631850 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-57bd9f8449-x6wvd" Nov 27 11:15:02 crc kubenswrapper[4807]: I1127 11:15:02.984415 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" Nov 27 11:15:03 crc kubenswrapper[4807]: I1127 11:15:03.081818 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-secret-volume\") pod \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\" (UID: \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\") " Nov 27 11:15:03 crc kubenswrapper[4807]: I1127 11:15:03.081863 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-config-volume\") pod \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\" (UID: \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\") " Nov 27 11:15:03 crc kubenswrapper[4807]: I1127 11:15:03.081894 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbnbx\" (UniqueName: \"kubernetes.io/projected/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-kube-api-access-jbnbx\") pod \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\" (UID: \"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2\") " Nov 27 11:15:03 crc kubenswrapper[4807]: I1127 11:15:03.083537 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2" (UID: "81fc3365-a2d9-4adb-a884-97c8c4b6d3f2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:15:03 crc kubenswrapper[4807]: I1127 11:15:03.087280 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "81fc3365-a2d9-4adb-a884-97c8c4b6d3f2" (UID: "81fc3365-a2d9-4adb-a884-97c8c4b6d3f2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:15:03 crc kubenswrapper[4807]: I1127 11:15:03.087458 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-kube-api-access-jbnbx" (OuterVolumeSpecName: "kube-api-access-jbnbx") pod "81fc3365-a2d9-4adb-a884-97c8c4b6d3f2" (UID: "81fc3365-a2d9-4adb-a884-97c8c4b6d3f2"). InnerVolumeSpecName "kube-api-access-jbnbx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:15:03 crc kubenswrapper[4807]: I1127 11:15:03.183600 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:03 crc kubenswrapper[4807]: I1127 11:15:03.183633 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:03 crc kubenswrapper[4807]: I1127 11:15:03.183643 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbnbx\" (UniqueName: \"kubernetes.io/projected/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2-kube-api-access-jbnbx\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:03 crc kubenswrapper[4807]: I1127 11:15:03.638753 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" event={"ID":"81fc3365-a2d9-4adb-a884-97c8c4b6d3f2","Type":"ContainerDied","Data":"0fee1be0fd15b227469b068ec0bccd5df19e4d62b62790efcc2885cc33cd26ab"} Nov 27 11:15:03 crc kubenswrapper[4807]: I1127 11:15:03.638789 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fee1be0fd15b227469b068ec0bccd5df19e4d62b62790efcc2885cc33cd26ab" Nov 27 11:15:03 crc kubenswrapper[4807]: I1127 11:15:03.638851 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.219875 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5242m"] Nov 27 11:15:08 crc kubenswrapper[4807]: E1127 11:15:08.222625 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81fc3365-a2d9-4adb-a884-97c8c4b6d3f2" containerName="collect-profiles" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.222757 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="81fc3365-a2d9-4adb-a884-97c8c4b6d3f2" containerName="collect-profiles" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.222976 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="81fc3365-a2d9-4adb-a884-97c8c4b6d3f2" containerName="collect-profiles" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.223771 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.240335 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5242m"] Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.349141 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9deb16ed-1b63-481f-b694-73bc599f987c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.349291 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qfxp\" (UniqueName: 
\"kubernetes.io/projected/9deb16ed-1b63-481f-b694-73bc599f987c-kube-api-access-9qfxp\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.349575 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9deb16ed-1b63-481f-b694-73bc599f987c-registry-certificates\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.349851 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9deb16ed-1b63-481f-b694-73bc599f987c-bound-sa-token\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.349998 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9deb16ed-1b63-481f-b694-73bc599f987c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.350126 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9deb16ed-1b63-481f-b694-73bc599f987c-trusted-ca\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc 
kubenswrapper[4807]: I1127 11:15:08.350185 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9deb16ed-1b63-481f-b694-73bc599f987c-registry-tls\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.350452 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.388733 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.452291 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qfxp\" (UniqueName: \"kubernetes.io/projected/9deb16ed-1b63-481f-b694-73bc599f987c-kube-api-access-9qfxp\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.452369 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9deb16ed-1b63-481f-b694-73bc599f987c-registry-certificates\") pod 
\"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.452422 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9deb16ed-1b63-481f-b694-73bc599f987c-bound-sa-token\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.452452 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9deb16ed-1b63-481f-b694-73bc599f987c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.452481 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9deb16ed-1b63-481f-b694-73bc599f987c-trusted-ca\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.452503 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9deb16ed-1b63-481f-b694-73bc599f987c-registry-tls\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.452547 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/9deb16ed-1b63-481f-b694-73bc599f987c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.453105 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9deb16ed-1b63-481f-b694-73bc599f987c-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.453729 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9deb16ed-1b63-481f-b694-73bc599f987c-registry-certificates\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.454762 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9deb16ed-1b63-481f-b694-73bc599f987c-trusted-ca\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.460750 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9deb16ed-1b63-481f-b694-73bc599f987c-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.461182 4807 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9deb16ed-1b63-481f-b694-73bc599f987c-registry-tls\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.468994 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9deb16ed-1b63-481f-b694-73bc599f987c-bound-sa-token\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.474396 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qfxp\" (UniqueName: \"kubernetes.io/projected/9deb16ed-1b63-481f-b694-73bc599f987c-kube-api-access-9qfxp\") pod \"image-registry-66df7c8f76-5242m\" (UID: \"9deb16ed-1b63-481f-b694-73bc599f987c\") " pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.546176 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:08 crc kubenswrapper[4807]: I1127 11:15:08.975507 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5242m"] Nov 27 11:15:09 crc kubenswrapper[4807]: I1127 11:15:09.688856 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5242m" event={"ID":"9deb16ed-1b63-481f-b694-73bc599f987c","Type":"ContainerStarted","Data":"fa6ca388c8f85077bd9da409b773c20edafc67f8e84259574551decd116b3192"} Nov 27 11:15:09 crc kubenswrapper[4807]: I1127 11:15:09.689357 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5242m" event={"ID":"9deb16ed-1b63-481f-b694-73bc599f987c","Type":"ContainerStarted","Data":"279b5cfe966ab2803aa7f4003a50435e3306be49e532bebe54914228dce84512"} Nov 27 11:15:09 crc kubenswrapper[4807]: I1127 11:15:09.689398 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:09 crc kubenswrapper[4807]: I1127 11:15:09.714505 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-5242m" podStartSLOduration=1.714483536 podStartE2EDuration="1.714483536s" podCreationTimestamp="2025-11-27 11:15:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:15:09.712483979 +0000 UTC m=+350.811982187" watchObservedRunningTime="2025-11-27 11:15:09.714483536 +0000 UTC m=+350.813981734" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.393862 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4bvrf"] Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.395776 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.397935 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.406789 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bvrf"] Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.481849 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7tcm\" (UniqueName: \"kubernetes.io/projected/2b1d58e4-b0b5-4213-a286-5237808fe138-kube-api-access-b7tcm\") pod \"redhat-marketplace-4bvrf\" (UID: \"2b1d58e4-b0b5-4213-a286-5237808fe138\") " pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.481917 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b1d58e4-b0b5-4213-a286-5237808fe138-utilities\") pod \"redhat-marketplace-4bvrf\" (UID: \"2b1d58e4-b0b5-4213-a286-5237808fe138\") " pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.481979 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b1d58e4-b0b5-4213-a286-5237808fe138-catalog-content\") pod \"redhat-marketplace-4bvrf\" (UID: \"2b1d58e4-b0b5-4213-a286-5237808fe138\") " pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.582889 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7tcm\" (UniqueName: \"kubernetes.io/projected/2b1d58e4-b0b5-4213-a286-5237808fe138-kube-api-access-b7tcm\") pod \"redhat-marketplace-4bvrf\" (UID: 
\"2b1d58e4-b0b5-4213-a286-5237808fe138\") " pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.582971 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b1d58e4-b0b5-4213-a286-5237808fe138-utilities\") pod \"redhat-marketplace-4bvrf\" (UID: \"2b1d58e4-b0b5-4213-a286-5237808fe138\") " pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.583028 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b1d58e4-b0b5-4213-a286-5237808fe138-catalog-content\") pod \"redhat-marketplace-4bvrf\" (UID: \"2b1d58e4-b0b5-4213-a286-5237808fe138\") " pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.583565 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b1d58e4-b0b5-4213-a286-5237808fe138-catalog-content\") pod \"redhat-marketplace-4bvrf\" (UID: \"2b1d58e4-b0b5-4213-a286-5237808fe138\") " pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.583652 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b1d58e4-b0b5-4213-a286-5237808fe138-utilities\") pod \"redhat-marketplace-4bvrf\" (UID: \"2b1d58e4-b0b5-4213-a286-5237808fe138\") " pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.585205 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hmsvr"] Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.587005 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.590111 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.601200 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmsvr"] Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.621187 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7tcm\" (UniqueName: \"kubernetes.io/projected/2b1d58e4-b0b5-4213-a286-5237808fe138-kube-api-access-b7tcm\") pod \"redhat-marketplace-4bvrf\" (UID: \"2b1d58e4-b0b5-4213-a286-5237808fe138\") " pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.684739 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6063b43-7eea-4c80-aa97-a65aa6790390-utilities\") pod \"redhat-operators-hmsvr\" (UID: \"e6063b43-7eea-4c80-aa97-a65aa6790390\") " pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.684879 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2wbt\" (UniqueName: \"kubernetes.io/projected/e6063b43-7eea-4c80-aa97-a65aa6790390-kube-api-access-s2wbt\") pod \"redhat-operators-hmsvr\" (UID: \"e6063b43-7eea-4c80-aa97-a65aa6790390\") " pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.684967 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6063b43-7eea-4c80-aa97-a65aa6790390-catalog-content\") pod \"redhat-operators-hmsvr\" (UID: 
\"e6063b43-7eea-4c80-aa97-a65aa6790390\") " pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.723957 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.791975 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2wbt\" (UniqueName: \"kubernetes.io/projected/e6063b43-7eea-4c80-aa97-a65aa6790390-kube-api-access-s2wbt\") pod \"redhat-operators-hmsvr\" (UID: \"e6063b43-7eea-4c80-aa97-a65aa6790390\") " pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.793860 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6063b43-7eea-4c80-aa97-a65aa6790390-catalog-content\") pod \"redhat-operators-hmsvr\" (UID: \"e6063b43-7eea-4c80-aa97-a65aa6790390\") " pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.793965 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6063b43-7eea-4c80-aa97-a65aa6790390-utilities\") pod \"redhat-operators-hmsvr\" (UID: \"e6063b43-7eea-4c80-aa97-a65aa6790390\") " pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.795695 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6063b43-7eea-4c80-aa97-a65aa6790390-utilities\") pod \"redhat-operators-hmsvr\" (UID: \"e6063b43-7eea-4c80-aa97-a65aa6790390\") " pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.796030 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e6063b43-7eea-4c80-aa97-a65aa6790390-catalog-content\") pod \"redhat-operators-hmsvr\" (UID: \"e6063b43-7eea-4c80-aa97-a65aa6790390\") " pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.819321 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2wbt\" (UniqueName: \"kubernetes.io/projected/e6063b43-7eea-4c80-aa97-a65aa6790390-kube-api-access-s2wbt\") pod \"redhat-operators-hmsvr\" (UID: \"e6063b43-7eea-4c80-aa97-a65aa6790390\") " pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:16 crc kubenswrapper[4807]: I1127 11:15:16.912340 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:17 crc kubenswrapper[4807]: I1127 11:15:17.099633 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hmsvr"] Nov 27 11:15:17 crc kubenswrapper[4807]: W1127 11:15:17.101857 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6063b43_7eea_4c80_aa97_a65aa6790390.slice/crio-d7d455d06734d4c1c2361e35b41da5803d46f6e2faca9f82792935155319f0fd WatchSource:0}: Error finding container d7d455d06734d4c1c2361e35b41da5803d46f6e2faca9f82792935155319f0fd: Status 404 returned error can't find the container with id d7d455d06734d4c1c2361e35b41da5803d46f6e2faca9f82792935155319f0fd Nov 27 11:15:17 crc kubenswrapper[4807]: I1127 11:15:17.195037 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4bvrf"] Nov 27 11:15:17 crc kubenswrapper[4807]: W1127 11:15:17.200014 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b1d58e4_b0b5_4213_a286_5237808fe138.slice/crio-572e2677d9d4702497eeff680c057fc2d6e998e21bdc334de92948de12fcc6fe 
WatchSource:0}: Error finding container 572e2677d9d4702497eeff680c057fc2d6e998e21bdc334de92948de12fcc6fe: Status 404 returned error can't find the container with id 572e2677d9d4702497eeff680c057fc2d6e998e21bdc334de92948de12fcc6fe Nov 27 11:15:17 crc kubenswrapper[4807]: I1127 11:15:17.741728 4807 generic.go:334] "Generic (PLEG): container finished" podID="2b1d58e4-b0b5-4213-a286-5237808fe138" containerID="206c981a311275d250070915ffbc6d1729bce6f1cfe601721c15ed20f31cb319" exitCode=0 Nov 27 11:15:17 crc kubenswrapper[4807]: I1127 11:15:17.741789 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bvrf" event={"ID":"2b1d58e4-b0b5-4213-a286-5237808fe138","Type":"ContainerDied","Data":"206c981a311275d250070915ffbc6d1729bce6f1cfe601721c15ed20f31cb319"} Nov 27 11:15:17 crc kubenswrapper[4807]: I1127 11:15:17.741866 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bvrf" event={"ID":"2b1d58e4-b0b5-4213-a286-5237808fe138","Type":"ContainerStarted","Data":"572e2677d9d4702497eeff680c057fc2d6e998e21bdc334de92948de12fcc6fe"} Nov 27 11:15:17 crc kubenswrapper[4807]: I1127 11:15:17.744545 4807 generic.go:334] "Generic (PLEG): container finished" podID="e6063b43-7eea-4c80-aa97-a65aa6790390" containerID="21e9628d3e22cab45974e886c617a335a3b0d6e542cf11a825c5c547365ea951" exitCode=0 Nov 27 11:15:17 crc kubenswrapper[4807]: I1127 11:15:17.744647 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmsvr" event={"ID":"e6063b43-7eea-4c80-aa97-a65aa6790390","Type":"ContainerDied","Data":"21e9628d3e22cab45974e886c617a335a3b0d6e542cf11a825c5c547365ea951"} Nov 27 11:15:17 crc kubenswrapper[4807]: I1127 11:15:17.744861 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmsvr" 
event={"ID":"e6063b43-7eea-4c80-aa97-a65aa6790390","Type":"ContainerStarted","Data":"d7d455d06734d4c1c2361e35b41da5803d46f6e2faca9f82792935155319f0fd"} Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.751273 4807 generic.go:334] "Generic (PLEG): container finished" podID="2b1d58e4-b0b5-4213-a286-5237808fe138" containerID="c0d792564a67f04512aba629bb3b42dfe32c0bf44e7ba99b16d057a18bdb18d7" exitCode=0 Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.751335 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bvrf" event={"ID":"2b1d58e4-b0b5-4213-a286-5237808fe138","Type":"ContainerDied","Data":"c0d792564a67f04512aba629bb3b42dfe32c0bf44e7ba99b16d057a18bdb18d7"} Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.754328 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmsvr" event={"ID":"e6063b43-7eea-4c80-aa97-a65aa6790390","Type":"ContainerStarted","Data":"2905c1d423c169e8eb5b9b5bda5c389678ae26c1f885e33724a6da7eab96ec56"} Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.783925 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j2bls"] Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.784898 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.786650 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.797237 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j2bls"] Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.934986 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldllc\" (UniqueName: \"kubernetes.io/projected/caec2dd0-c63a-4572-9511-5e5b3be487fb-kube-api-access-ldllc\") pod \"community-operators-j2bls\" (UID: \"caec2dd0-c63a-4572-9511-5e5b3be487fb\") " pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.935049 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caec2dd0-c63a-4572-9511-5e5b3be487fb-utilities\") pod \"community-operators-j2bls\" (UID: \"caec2dd0-c63a-4572-9511-5e5b3be487fb\") " pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.935102 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caec2dd0-c63a-4572-9511-5e5b3be487fb-catalog-content\") pod \"community-operators-j2bls\" (UID: \"caec2dd0-c63a-4572-9511-5e5b3be487fb\") " pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.982410 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hs4rb"] Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.983392 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.985429 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 27 11:15:18 crc kubenswrapper[4807]: I1127 11:15:18.990361 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hs4rb"] Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.036049 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldllc\" (UniqueName: \"kubernetes.io/projected/caec2dd0-c63a-4572-9511-5e5b3be487fb-kube-api-access-ldllc\") pod \"community-operators-j2bls\" (UID: \"caec2dd0-c63a-4572-9511-5e5b3be487fb\") " pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.036098 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caec2dd0-c63a-4572-9511-5e5b3be487fb-utilities\") pod \"community-operators-j2bls\" (UID: \"caec2dd0-c63a-4572-9511-5e5b3be487fb\") " pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.036117 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caec2dd0-c63a-4572-9511-5e5b3be487fb-catalog-content\") pod \"community-operators-j2bls\" (UID: \"caec2dd0-c63a-4572-9511-5e5b3be487fb\") " pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.036517 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caec2dd0-c63a-4572-9511-5e5b3be487fb-catalog-content\") pod \"community-operators-j2bls\" (UID: \"caec2dd0-c63a-4572-9511-5e5b3be487fb\") " 
pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.037158 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caec2dd0-c63a-4572-9511-5e5b3be487fb-utilities\") pod \"community-operators-j2bls\" (UID: \"caec2dd0-c63a-4572-9511-5e5b3be487fb\") " pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.055182 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldllc\" (UniqueName: \"kubernetes.io/projected/caec2dd0-c63a-4572-9511-5e5b3be487fb-kube-api-access-ldllc\") pod \"community-operators-j2bls\" (UID: \"caec2dd0-c63a-4572-9511-5e5b3be487fb\") " pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.105704 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.138517 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-857xq\" (UniqueName: \"kubernetes.io/projected/fc334283-2738-4bcb-9ad6-b0654ceb5032-kube-api-access-857xq\") pod \"certified-operators-hs4rb\" (UID: \"fc334283-2738-4bcb-9ad6-b0654ceb5032\") " pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.138687 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc334283-2738-4bcb-9ad6-b0654ceb5032-utilities\") pod \"certified-operators-hs4rb\" (UID: \"fc334283-2738-4bcb-9ad6-b0654ceb5032\") " pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.138754 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc334283-2738-4bcb-9ad6-b0654ceb5032-catalog-content\") pod \"certified-operators-hs4rb\" (UID: \"fc334283-2738-4bcb-9ad6-b0654ceb5032\") " pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.240134 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc334283-2738-4bcb-9ad6-b0654ceb5032-utilities\") pod \"certified-operators-hs4rb\" (UID: \"fc334283-2738-4bcb-9ad6-b0654ceb5032\") " pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.240205 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc334283-2738-4bcb-9ad6-b0654ceb5032-catalog-content\") pod \"certified-operators-hs4rb\" (UID: \"fc334283-2738-4bcb-9ad6-b0654ceb5032\") " pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.240323 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-857xq\" (UniqueName: \"kubernetes.io/projected/fc334283-2738-4bcb-9ad6-b0654ceb5032-kube-api-access-857xq\") pod \"certified-operators-hs4rb\" (UID: \"fc334283-2738-4bcb-9ad6-b0654ceb5032\") " pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.240882 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc334283-2738-4bcb-9ad6-b0654ceb5032-utilities\") pod \"certified-operators-hs4rb\" (UID: \"fc334283-2738-4bcb-9ad6-b0654ceb5032\") " pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.241515 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc334283-2738-4bcb-9ad6-b0654ceb5032-catalog-content\") pod \"certified-operators-hs4rb\" (UID: \"fc334283-2738-4bcb-9ad6-b0654ceb5032\") " pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.264961 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-857xq\" (UniqueName: \"kubernetes.io/projected/fc334283-2738-4bcb-9ad6-b0654ceb5032-kube-api-access-857xq\") pod \"certified-operators-hs4rb\" (UID: \"fc334283-2738-4bcb-9ad6-b0654ceb5032\") " pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.298708 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.524909 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j2bls"] Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.767049 4807 generic.go:334] "Generic (PLEG): container finished" podID="e6063b43-7eea-4c80-aa97-a65aa6790390" containerID="2905c1d423c169e8eb5b9b5bda5c389678ae26c1f885e33724a6da7eab96ec56" exitCode=0 Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.767143 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmsvr" event={"ID":"e6063b43-7eea-4c80-aa97-a65aa6790390","Type":"ContainerDied","Data":"2905c1d423c169e8eb5b9b5bda5c389678ae26c1f885e33724a6da7eab96ec56"} Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.768808 4807 generic.go:334] "Generic (PLEG): container finished" podID="caec2dd0-c63a-4572-9511-5e5b3be487fb" containerID="2df0a9a38cbff88bfa5b0c178cc8ba91c3253cc36bf0ae1e1b9fc2d44419247a" exitCode=0 Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.768968 4807 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-j2bls" event={"ID":"caec2dd0-c63a-4572-9511-5e5b3be487fb","Type":"ContainerDied","Data":"2df0a9a38cbff88bfa5b0c178cc8ba91c3253cc36bf0ae1e1b9fc2d44419247a"} Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.769001 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2bls" event={"ID":"caec2dd0-c63a-4572-9511-5e5b3be487fb","Type":"ContainerStarted","Data":"63eab6f0670377da29f765674e9ce16434a23d908068217cf62dd36b36eadebe"} Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.774443 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4bvrf" event={"ID":"2b1d58e4-b0b5-4213-a286-5237808fe138","Type":"ContainerStarted","Data":"13e42ac98558a534cf7f47f686a1d1560cbea79e3ddc540de1d602ec3014f1ee"} Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.807102 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hs4rb"] Nov 27 11:15:19 crc kubenswrapper[4807]: I1127 11:15:19.837672 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4bvrf" podStartSLOduration=2.164007072 podStartE2EDuration="3.837654753s" podCreationTimestamp="2025-11-27 11:15:16 +0000 UTC" firstStartedPulling="2025-11-27 11:15:17.744500962 +0000 UTC m=+358.843999160" lastFinishedPulling="2025-11-27 11:15:19.418148623 +0000 UTC m=+360.517646841" observedRunningTime="2025-11-27 11:15:19.834966237 +0000 UTC m=+360.934464435" watchObservedRunningTime="2025-11-27 11:15:19.837654753 +0000 UTC m=+360.937152951" Nov 27 11:15:20 crc kubenswrapper[4807]: I1127 11:15:20.781055 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2bls" event={"ID":"caec2dd0-c63a-4572-9511-5e5b3be487fb","Type":"ContainerStarted","Data":"bdafadc82c84f6d22aebbb736839ff3bcadad2a3d7efae12aa7f84d5f179440e"} Nov 27 
11:15:20 crc kubenswrapper[4807]: I1127 11:15:20.782368 4807 generic.go:334] "Generic (PLEG): container finished" podID="fc334283-2738-4bcb-9ad6-b0654ceb5032" containerID="95b197d61e3e11d039ed38dd09f1e16d122d8154737fb51419d5cb684ef66a6c" exitCode=0 Nov 27 11:15:20 crc kubenswrapper[4807]: I1127 11:15:20.782432 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hs4rb" event={"ID":"fc334283-2738-4bcb-9ad6-b0654ceb5032","Type":"ContainerDied","Data":"95b197d61e3e11d039ed38dd09f1e16d122d8154737fb51419d5cb684ef66a6c"} Nov 27 11:15:20 crc kubenswrapper[4807]: I1127 11:15:20.782453 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hs4rb" event={"ID":"fc334283-2738-4bcb-9ad6-b0654ceb5032","Type":"ContainerStarted","Data":"120ddad0fccb2deba5f5a10816eace5ec33fe27cf1407338d2db50187e6e6675"} Nov 27 11:15:20 crc kubenswrapper[4807]: I1127 11:15:20.784735 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hmsvr" event={"ID":"e6063b43-7eea-4c80-aa97-a65aa6790390","Type":"ContainerStarted","Data":"913010cab592cedff897778a8248ec25e11cd9ec54cb072c36e8d46bdd451678"} Nov 27 11:15:20 crc kubenswrapper[4807]: I1127 11:15:20.837530 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hmsvr" podStartSLOduration=2.112550169 podStartE2EDuration="4.837513709s" podCreationTimestamp="2025-11-27 11:15:16 +0000 UTC" firstStartedPulling="2025-11-27 11:15:17.747402204 +0000 UTC m=+358.846900402" lastFinishedPulling="2025-11-27 11:15:20.472365734 +0000 UTC m=+361.571863942" observedRunningTime="2025-11-27 11:15:20.835022059 +0000 UTC m=+361.934520247" watchObservedRunningTime="2025-11-27 11:15:20.837513709 +0000 UTC m=+361.937011907" Nov 27 11:15:20 crc kubenswrapper[4807]: I1127 11:15:20.922138 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:15:20 crc kubenswrapper[4807]: I1127 11:15:20.922216 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:15:21 crc kubenswrapper[4807]: I1127 11:15:21.793860 4807 generic.go:334] "Generic (PLEG): container finished" podID="caec2dd0-c63a-4572-9511-5e5b3be487fb" containerID="bdafadc82c84f6d22aebbb736839ff3bcadad2a3d7efae12aa7f84d5f179440e" exitCode=0 Nov 27 11:15:21 crc kubenswrapper[4807]: I1127 11:15:21.793966 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2bls" event={"ID":"caec2dd0-c63a-4572-9511-5e5b3be487fb","Type":"ContainerDied","Data":"bdafadc82c84f6d22aebbb736839ff3bcadad2a3d7efae12aa7f84d5f179440e"} Nov 27 11:15:22 crc kubenswrapper[4807]: I1127 11:15:22.804181 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2bls" event={"ID":"caec2dd0-c63a-4572-9511-5e5b3be487fb","Type":"ContainerStarted","Data":"a4daf4586a9266b59b1f6c3228d4a33de6a93ff1da981eb0ffba75869bd7c662"} Nov 27 11:15:22 crc kubenswrapper[4807]: I1127 11:15:22.809954 4807 generic.go:334] "Generic (PLEG): container finished" podID="fc334283-2738-4bcb-9ad6-b0654ceb5032" containerID="24b0d33e33748d41ed2aafc94ddd287898ad50b54286b33516b5c78d8e8c004e" exitCode=0 Nov 27 11:15:22 crc kubenswrapper[4807]: I1127 11:15:22.810011 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hs4rb" 
event={"ID":"fc334283-2738-4bcb-9ad6-b0654ceb5032","Type":"ContainerDied","Data":"24b0d33e33748d41ed2aafc94ddd287898ad50b54286b33516b5c78d8e8c004e"} Nov 27 11:15:22 crc kubenswrapper[4807]: I1127 11:15:22.827518 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j2bls" podStartSLOduration=2.011132254 podStartE2EDuration="4.827485136s" podCreationTimestamp="2025-11-27 11:15:18 +0000 UTC" firstStartedPulling="2025-11-27 11:15:19.770848906 +0000 UTC m=+360.870347104" lastFinishedPulling="2025-11-27 11:15:22.587201788 +0000 UTC m=+363.686699986" observedRunningTime="2025-11-27 11:15:22.826870249 +0000 UTC m=+363.926368487" watchObservedRunningTime="2025-11-27 11:15:22.827485136 +0000 UTC m=+363.926983334" Nov 27 11:15:24 crc kubenswrapper[4807]: I1127 11:15:24.822676 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hs4rb" event={"ID":"fc334283-2738-4bcb-9ad6-b0654ceb5032","Type":"ContainerStarted","Data":"f54cbf19f160aa2ddfec96018d505cac8e35cd21b79854179301f2b9ac341619"} Nov 27 11:15:24 crc kubenswrapper[4807]: I1127 11:15:24.845369 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hs4rb" podStartSLOduration=3.8677707740000002 podStartE2EDuration="6.84534854s" podCreationTimestamp="2025-11-27 11:15:18 +0000 UTC" firstStartedPulling="2025-11-27 11:15:20.783445552 +0000 UTC m=+361.882943750" lastFinishedPulling="2025-11-27 11:15:23.761023308 +0000 UTC m=+364.860521516" observedRunningTime="2025-11-27 11:15:24.841354917 +0000 UTC m=+365.940853115" watchObservedRunningTime="2025-11-27 11:15:24.84534854 +0000 UTC m=+365.944846748" Nov 27 11:15:26 crc kubenswrapper[4807]: I1127 11:15:26.725926 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:26 crc kubenswrapper[4807]: I1127 11:15:26.726307 4807 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:26 crc kubenswrapper[4807]: I1127 11:15:26.772817 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:26 crc kubenswrapper[4807]: I1127 11:15:26.890774 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4bvrf" Nov 27 11:15:26 crc kubenswrapper[4807]: I1127 11:15:26.912687 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:26 crc kubenswrapper[4807]: I1127 11:15:26.912792 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:26 crc kubenswrapper[4807]: I1127 11:15:26.981129 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:27 crc kubenswrapper[4807]: I1127 11:15:27.349139 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-prrd9"] Nov 27 11:15:27 crc kubenswrapper[4807]: I1127 11:15:27.349355 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" podUID="45883014-d0c7-4ce0-ad6b-2ea83ab167ca" containerName="controller-manager" containerID="cri-o://b0030f4ed48a98ca8f59115d4b872121819ccd5290871c57d9ea1bc95def4887" gracePeriod=30 Nov 27 11:15:28 crc kubenswrapper[4807]: I1127 11:15:27.878210 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hmsvr" Nov 27 11:15:28 crc kubenswrapper[4807]: I1127 11:15:28.556828 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66df7c8f76-5242m" Nov 27 11:15:28 crc kubenswrapper[4807]: I1127 11:15:28.621044 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sdd69"] Nov 27 11:15:28 crc kubenswrapper[4807]: I1127 11:15:28.846059 4807 generic.go:334] "Generic (PLEG): container finished" podID="45883014-d0c7-4ce0-ad6b-2ea83ab167ca" containerID="b0030f4ed48a98ca8f59115d4b872121819ccd5290871c57d9ea1bc95def4887" exitCode=0 Nov 27 11:15:28 crc kubenswrapper[4807]: I1127 11:15:28.846185 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" event={"ID":"45883014-d0c7-4ce0-ad6b-2ea83ab167ca","Type":"ContainerDied","Data":"b0030f4ed48a98ca8f59115d4b872121819ccd5290871c57d9ea1bc95def4887"} Nov 27 11:15:28 crc kubenswrapper[4807]: I1127 11:15:28.928682 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:15:28 crc kubenswrapper[4807]: I1127 11:15:28.957289 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5795874699-6jbxf"] Nov 27 11:15:28 crc kubenswrapper[4807]: E1127 11:15:28.957492 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45883014-d0c7-4ce0-ad6b-2ea83ab167ca" containerName="controller-manager" Nov 27 11:15:28 crc kubenswrapper[4807]: I1127 11:15:28.957503 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="45883014-d0c7-4ce0-ad6b-2ea83ab167ca" containerName="controller-manager" Nov 27 11:15:28 crc kubenswrapper[4807]: I1127 11:15:28.957591 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="45883014-d0c7-4ce0-ad6b-2ea83ab167ca" containerName="controller-manager" Nov 27 11:15:28 crc kubenswrapper[4807]: I1127 11:15:28.958606 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:28 crc kubenswrapper[4807]: I1127 11:15:28.964704 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5795874699-6jbxf"] Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.073901 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-proxy-ca-bundles\") pod \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.073957 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-serving-cert\") pod \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.073991 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8sm4\" (UniqueName: \"kubernetes.io/projected/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-kube-api-access-j8sm4\") pod \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.074048 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-client-ca\") pod \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\" (UID: \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.074082 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-config\") pod \"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\" (UID: 
\"45883014-d0c7-4ce0-ad6b-2ea83ab167ca\") " Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.074277 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e792fe1-bcf8-4237-9640-6a838e3badb9-config\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.074334 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e792fe1-bcf8-4237-9640-6a838e3badb9-serving-cert\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.074369 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e792fe1-bcf8-4237-9640-6a838e3badb9-client-ca\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.074395 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wvt6\" (UniqueName: \"kubernetes.io/projected/1e792fe1-bcf8-4237-9640-6a838e3badb9-kube-api-access-4wvt6\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.074545 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/1e792fe1-bcf8-4237-9640-6a838e3badb9-proxy-ca-bundles\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.074788 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-client-ca" (OuterVolumeSpecName: "client-ca") pod "45883014-d0c7-4ce0-ad6b-2ea83ab167ca" (UID: "45883014-d0c7-4ce0-ad6b-2ea83ab167ca"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.074856 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "45883014-d0c7-4ce0-ad6b-2ea83ab167ca" (UID: "45883014-d0c7-4ce0-ad6b-2ea83ab167ca"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.074886 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-config" (OuterVolumeSpecName: "config") pod "45883014-d0c7-4ce0-ad6b-2ea83ab167ca" (UID: "45883014-d0c7-4ce0-ad6b-2ea83ab167ca"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.078785 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-kube-api-access-j8sm4" (OuterVolumeSpecName: "kube-api-access-j8sm4") pod "45883014-d0c7-4ce0-ad6b-2ea83ab167ca" (UID: "45883014-d0c7-4ce0-ad6b-2ea83ab167ca"). InnerVolumeSpecName "kube-api-access-j8sm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.078835 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "45883014-d0c7-4ce0-ad6b-2ea83ab167ca" (UID: "45883014-d0c7-4ce0-ad6b-2ea83ab167ca"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.106409 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.106739 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.145934 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.175800 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e792fe1-bcf8-4237-9640-6a838e3badb9-config\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.175872 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e792fe1-bcf8-4237-9640-6a838e3badb9-serving-cert\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.175913 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e792fe1-bcf8-4237-9640-6a838e3badb9-client-ca\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.175943 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wvt6\" (UniqueName: \"kubernetes.io/projected/1e792fe1-bcf8-4237-9640-6a838e3badb9-kube-api-access-4wvt6\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.175990 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e792fe1-bcf8-4237-9640-6a838e3badb9-proxy-ca-bundles\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.176049 4807 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-client-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.176063 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.176075 4807 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:29 crc 
kubenswrapper[4807]: I1127 11:15:29.176089 4807 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.176101 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8sm4\" (UniqueName: \"kubernetes.io/projected/45883014-d0c7-4ce0-ad6b-2ea83ab167ca-kube-api-access-j8sm4\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.176943 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1e792fe1-bcf8-4237-9640-6a838e3badb9-client-ca\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.177060 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1e792fe1-bcf8-4237-9640-6a838e3badb9-proxy-ca-bundles\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.177393 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e792fe1-bcf8-4237-9640-6a838e3badb9-config\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.180232 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e792fe1-bcf8-4237-9640-6a838e3badb9-serving-cert\") pod 
\"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.195192 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wvt6\" (UniqueName: \"kubernetes.io/projected/1e792fe1-bcf8-4237-9640-6a838e3badb9-kube-api-access-4wvt6\") pod \"controller-manager-5795874699-6jbxf\" (UID: \"1e792fe1-bcf8-4237-9640-6a838e3badb9\") " pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.281408 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.300064 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.300103 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.347446 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.679563 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5795874699-6jbxf"] Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.851209 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" event={"ID":"1e792fe1-bcf8-4237-9640-6a838e3badb9","Type":"ContainerStarted","Data":"062edb4ac6583191444da5e5b77b5271f8658054743c0bc6af3c4477fc649a96"} Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.851264 4807 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" event={"ID":"1e792fe1-bcf8-4237-9640-6a838e3badb9","Type":"ContainerStarted","Data":"ce340c786e69f8ff1481f634391485178d45ec14ac1189f0d6001476b260c347"} Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.851438 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.852686 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" event={"ID":"45883014-d0c7-4ce0-ad6b-2ea83ab167ca","Type":"ContainerDied","Data":"f22f46fe820d9da4fd45da2f62753988f9ec66afc9dc41cba2c1a213fc293a16"} Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.852727 4807 scope.go:117] "RemoveContainer" containerID="b0030f4ed48a98ca8f59115d4b872121819ccd5290871c57d9ea1bc95def4887" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.852775 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-658fd5994d-prrd9" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.853170 4807 patch_prober.go:28] interesting pod/controller-manager-5795874699-6jbxf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.71:8443/healthz\": dial tcp 10.217.0.71:8443: connect: connection refused" start-of-body= Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.853237 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" podUID="1e792fe1-bcf8-4237-9640-6a838e3badb9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.71:8443/healthz\": dial tcp 10.217.0.71:8443: connect: connection refused" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.887075 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" podStartSLOduration=2.8870561070000003 podStartE2EDuration="2.887056107s" podCreationTimestamp="2025-11-27 11:15:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:15:29.872304751 +0000 UTC m=+370.971802959" watchObservedRunningTime="2025-11-27 11:15:29.887056107 +0000 UTC m=+370.986554305" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.888491 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-prrd9"] Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.891830 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-658fd5994d-prrd9"] Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.903643 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-hs4rb" Nov 27 11:15:29 crc kubenswrapper[4807]: I1127 11:15:29.911736 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:15:30 crc kubenswrapper[4807]: I1127 11:15:30.863850 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5795874699-6jbxf" Nov 27 11:15:31 crc kubenswrapper[4807]: I1127 11:15:31.541219 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45883014-d0c7-4ce0-ad6b-2ea83ab167ca" path="/var/lib/kubelet/pods/45883014-d0c7-4ce0-ad6b-2ea83ab167ca/volumes" Nov 27 11:15:50 crc kubenswrapper[4807]: I1127 11:15:50.922039 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:15:50 crc kubenswrapper[4807]: I1127 11:15:50.922528 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:15:53 crc kubenswrapper[4807]: I1127 11:15:53.682295 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" podUID="35c0378e-2da0-4e94-8230-2db66a4c7993" containerName="registry" containerID="cri-o://1517724e1d08cf6ea9812b2f939807a259e901352ff3949c2dd4795c3a029eeb" gracePeriod=30 Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.002755 4807 generic.go:334] "Generic (PLEG): container finished" podID="35c0378e-2da0-4e94-8230-2db66a4c7993" 
containerID="1517724e1d08cf6ea9812b2f939807a259e901352ff3949c2dd4795c3a029eeb" exitCode=0 Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.002879 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" event={"ID":"35c0378e-2da0-4e94-8230-2db66a4c7993","Type":"ContainerDied","Data":"1517724e1d08cf6ea9812b2f939807a259e901352ff3949c2dd4795c3a029eeb"} Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.270607 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.343171 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-registry-tls\") pod \"35c0378e-2da0-4e94-8230-2db66a4c7993\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.343276 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/35c0378e-2da0-4e94-8230-2db66a4c7993-installation-pull-secrets\") pod \"35c0378e-2da0-4e94-8230-2db66a4c7993\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.343312 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/35c0378e-2da0-4e94-8230-2db66a4c7993-ca-trust-extracted\") pod \"35c0378e-2da0-4e94-8230-2db66a4c7993\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.343336 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/35c0378e-2da0-4e94-8230-2db66a4c7993-registry-certificates\") pod 
\"35c0378e-2da0-4e94-8230-2db66a4c7993\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.343370 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-bound-sa-token\") pod \"35c0378e-2da0-4e94-8230-2db66a4c7993\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.343492 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"35c0378e-2da0-4e94-8230-2db66a4c7993\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.343524 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c4r6\" (UniqueName: \"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-kube-api-access-2c4r6\") pod \"35c0378e-2da0-4e94-8230-2db66a4c7993\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.343552 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35c0378e-2da0-4e94-8230-2db66a4c7993-trusted-ca\") pod \"35c0378e-2da0-4e94-8230-2db66a4c7993\" (UID: \"35c0378e-2da0-4e94-8230-2db66a4c7993\") " Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.344366 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c0378e-2da0-4e94-8230-2db66a4c7993-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "35c0378e-2da0-4e94-8230-2db66a4c7993" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.345110 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c0378e-2da0-4e94-8230-2db66a4c7993-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "35c0378e-2da0-4e94-8230-2db66a4c7993" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.350722 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "35c0378e-2da0-4e94-8230-2db66a4c7993" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.351378 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "35c0378e-2da0-4e94-8230-2db66a4c7993" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.351521 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c0378e-2da0-4e94-8230-2db66a4c7993-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "35c0378e-2da0-4e94-8230-2db66a4c7993" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.353427 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-kube-api-access-2c4r6" (OuterVolumeSpecName: "kube-api-access-2c4r6") pod "35c0378e-2da0-4e94-8230-2db66a4c7993" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993"). InnerVolumeSpecName "kube-api-access-2c4r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.358973 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "35c0378e-2da0-4e94-8230-2db66a4c7993" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.367355 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35c0378e-2da0-4e94-8230-2db66a4c7993-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "35c0378e-2da0-4e94-8230-2db66a4c7993" (UID: "35c0378e-2da0-4e94-8230-2db66a4c7993"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.444371 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c4r6\" (UniqueName: \"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-kube-api-access-2c4r6\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.444404 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35c0378e-2da0-4e94-8230-2db66a4c7993-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.444413 4807 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.444422 4807 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/35c0378e-2da0-4e94-8230-2db66a4c7993-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.444430 4807 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/35c0378e-2da0-4e94-8230-2db66a4c7993-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.444438 4807 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/35c0378e-2da0-4e94-8230-2db66a4c7993-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:54 crc kubenswrapper[4807]: I1127 11:15:54.444446 4807 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/35c0378e-2da0-4e94-8230-2db66a4c7993-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 27 11:15:55 crc 
kubenswrapper[4807]: I1127 11:15:55.013087 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" event={"ID":"35c0378e-2da0-4e94-8230-2db66a4c7993","Type":"ContainerDied","Data":"9b539da781fd854f195e52cb89a261a9a03b478705473a6cdd281f5f55f90437"} Nov 27 11:15:55 crc kubenswrapper[4807]: I1127 11:15:55.013164 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-sdd69" Nov 27 11:15:55 crc kubenswrapper[4807]: I1127 11:15:55.013661 4807 scope.go:117] "RemoveContainer" containerID="1517724e1d08cf6ea9812b2f939807a259e901352ff3949c2dd4795c3a029eeb" Nov 27 11:15:55 crc kubenswrapper[4807]: I1127 11:15:55.065311 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sdd69"] Nov 27 11:15:55 crc kubenswrapper[4807]: I1127 11:15:55.072008 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-sdd69"] Nov 27 11:15:55 crc kubenswrapper[4807]: I1127 11:15:55.543875 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c0378e-2da0-4e94-8230-2db66a4c7993" path="/var/lib/kubelet/pods/35c0378e-2da0-4e94-8230-2db66a4c7993/volumes" Nov 27 11:16:20 crc kubenswrapper[4807]: I1127 11:16:20.921751 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:16:20 crc kubenswrapper[4807]: I1127 11:16:20.922654 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:16:20 crc kubenswrapper[4807]: I1127 11:16:20.922776 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:16:20 crc kubenswrapper[4807]: I1127 11:16:20.925378 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cc81507d52c9f1bfb0bb1ff2c6a207a6a959b377cb7154504d0530b5e35f12d5"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 11:16:20 crc kubenswrapper[4807]: I1127 11:16:20.925452 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://cc81507d52c9f1bfb0bb1ff2c6a207a6a959b377cb7154504d0530b5e35f12d5" gracePeriod=600 Nov 27 11:16:21 crc kubenswrapper[4807]: I1127 11:16:21.181409 4807 generic.go:334] "Generic (PLEG): container finished" podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerID="cc81507d52c9f1bfb0bb1ff2c6a207a6a959b377cb7154504d0530b5e35f12d5" exitCode=0 Nov 27 11:16:21 crc kubenswrapper[4807]: I1127 11:16:21.181461 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"cc81507d52c9f1bfb0bb1ff2c6a207a6a959b377cb7154504d0530b5e35f12d5"} Nov 27 11:16:21 crc kubenswrapper[4807]: I1127 11:16:21.181498 4807 scope.go:117] "RemoveContainer" containerID="bd76c06730caf399f3a17ead7d16a5afd905255fba63cbd15a3c92f8f88dbe2e" Nov 27 11:16:22 crc kubenswrapper[4807]: I1127 11:16:22.187645 4807 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"950bd2e48636b06df84e0002296e518c85524a06e8b0c7352cd93856ad7f71ef"} Nov 27 11:18:19 crc kubenswrapper[4807]: I1127 11:18:19.744036 4807 scope.go:117] "RemoveContainer" containerID="b69ce7230ed8711a77515a9608bc519b5fe92f85a9bd5ff61d9d6364bd5955aa" Nov 27 11:18:50 crc kubenswrapper[4807]: I1127 11:18:50.921296 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:18:50 crc kubenswrapper[4807]: I1127 11:18:50.922060 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.491778 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-cbsmh"] Nov 27 11:19:08 crc kubenswrapper[4807]: E1127 11:19:08.493668 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c0378e-2da0-4e94-8230-2db66a4c7993" containerName="registry" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.493683 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c0378e-2da0-4e94-8230-2db66a4c7993" containerName="registry" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.493782 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c0378e-2da0-4e94-8230-2db66a4c7993" containerName="registry" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.494148 4807 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-cbsmh" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.496058 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.496272 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.496387 4807 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2mn9s" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.498989 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-czgf5"] Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.499613 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-czgf5" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.501136 4807 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-6fbrq" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.503636 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-cbsmh"] Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.511784 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nwhxv"] Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.512382 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwhxv" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.513617 4807 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-nxlk2" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.521700 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-czgf5"] Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.527974 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nwhxv"] Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.593448 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh9vn\" (UniqueName: \"kubernetes.io/projected/bc3f48f0-2d12-4c07-bfb4-20914aeaf910-kube-api-access-zh9vn\") pod \"cert-manager-5b446d88c5-czgf5\" (UID: \"bc3f48f0-2d12-4c07-bfb4-20914aeaf910\") " pod="cert-manager/cert-manager-5b446d88c5-czgf5" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.593490 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj8n4\" (UniqueName: \"kubernetes.io/projected/8285736f-5d32-4503-94dd-f3e7c5d6a8f0-kube-api-access-rj8n4\") pod \"cert-manager-cainjector-7f985d654d-cbsmh\" (UID: \"8285736f-5d32-4503-94dd-f3e7c5d6a8f0\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-cbsmh" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.593516 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r8jr\" (UniqueName: \"kubernetes.io/projected/622a1ad1-bedf-4836-aa6e-0257f4694ae9-kube-api-access-4r8jr\") pod \"cert-manager-webhook-5655c58dd6-nwhxv\" (UID: \"622a1ad1-bedf-4836-aa6e-0257f4694ae9\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nwhxv" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.694821 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh9vn\" (UniqueName: \"kubernetes.io/projected/bc3f48f0-2d12-4c07-bfb4-20914aeaf910-kube-api-access-zh9vn\") pod \"cert-manager-5b446d88c5-czgf5\" (UID: \"bc3f48f0-2d12-4c07-bfb4-20914aeaf910\") " pod="cert-manager/cert-manager-5b446d88c5-czgf5" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.695088 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj8n4\" (UniqueName: \"kubernetes.io/projected/8285736f-5d32-4503-94dd-f3e7c5d6a8f0-kube-api-access-rj8n4\") pod \"cert-manager-cainjector-7f985d654d-cbsmh\" (UID: \"8285736f-5d32-4503-94dd-f3e7c5d6a8f0\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-cbsmh" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.695200 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r8jr\" (UniqueName: \"kubernetes.io/projected/622a1ad1-bedf-4836-aa6e-0257f4694ae9-kube-api-access-4r8jr\") pod \"cert-manager-webhook-5655c58dd6-nwhxv\" (UID: \"622a1ad1-bedf-4836-aa6e-0257f4694ae9\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nwhxv" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.712457 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r8jr\" (UniqueName: \"kubernetes.io/projected/622a1ad1-bedf-4836-aa6e-0257f4694ae9-kube-api-access-4r8jr\") pod \"cert-manager-webhook-5655c58dd6-nwhxv\" (UID: \"622a1ad1-bedf-4836-aa6e-0257f4694ae9\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-nwhxv" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.713332 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj8n4\" (UniqueName: \"kubernetes.io/projected/8285736f-5d32-4503-94dd-f3e7c5d6a8f0-kube-api-access-rj8n4\") pod \"cert-manager-cainjector-7f985d654d-cbsmh\" (UID: \"8285736f-5d32-4503-94dd-f3e7c5d6a8f0\") " 
pod="cert-manager/cert-manager-cainjector-7f985d654d-cbsmh" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.716295 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh9vn\" (UniqueName: \"kubernetes.io/projected/bc3f48f0-2d12-4c07-bfb4-20914aeaf910-kube-api-access-zh9vn\") pod \"cert-manager-5b446d88c5-czgf5\" (UID: \"bc3f48f0-2d12-4c07-bfb4-20914aeaf910\") " pod="cert-manager/cert-manager-5b446d88c5-czgf5" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.818675 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-cbsmh" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.828356 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-czgf5" Nov 27 11:19:08 crc kubenswrapper[4807]: I1127 11:19:08.841472 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwhxv" Nov 27 11:19:09 crc kubenswrapper[4807]: I1127 11:19:09.057202 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-czgf5"] Nov 27 11:19:09 crc kubenswrapper[4807]: I1127 11:19:09.069205 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 11:19:09 crc kubenswrapper[4807]: I1127 11:19:09.092999 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-cbsmh"] Nov 27 11:19:09 crc kubenswrapper[4807]: W1127 11:19:09.095408 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8285736f_5d32_4503_94dd_f3e7c5d6a8f0.slice/crio-53c65c860b26e08bc249379478715ba80d98b6598845ee6270f6e5d5e363754d WatchSource:0}: Error finding container 53c65c860b26e08bc249379478715ba80d98b6598845ee6270f6e5d5e363754d: Status 404 returned error 
can't find the container with id 53c65c860b26e08bc249379478715ba80d98b6598845ee6270f6e5d5e363754d Nov 27 11:19:09 crc kubenswrapper[4807]: I1127 11:19:09.139008 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-nwhxv"] Nov 27 11:19:09 crc kubenswrapper[4807]: W1127 11:19:09.150628 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod622a1ad1_bedf_4836_aa6e_0257f4694ae9.slice/crio-c358906f7eb8d9faeb30e999920a7efe06246d0f7610d96bcd81d22d2019294f WatchSource:0}: Error finding container c358906f7eb8d9faeb30e999920a7efe06246d0f7610d96bcd81d22d2019294f: Status 404 returned error can't find the container with id c358906f7eb8d9faeb30e999920a7efe06246d0f7610d96bcd81d22d2019294f Nov 27 11:19:09 crc kubenswrapper[4807]: I1127 11:19:09.208522 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-czgf5" event={"ID":"bc3f48f0-2d12-4c07-bfb4-20914aeaf910","Type":"ContainerStarted","Data":"f54c6ef50b6f671e069da0d0bc22f9f11016b0178e5f3c3787cc4ac65e8dfa0d"} Nov 27 11:19:09 crc kubenswrapper[4807]: I1127 11:19:09.210254 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwhxv" event={"ID":"622a1ad1-bedf-4836-aa6e-0257f4694ae9","Type":"ContainerStarted","Data":"c358906f7eb8d9faeb30e999920a7efe06246d0f7610d96bcd81d22d2019294f"} Nov 27 11:19:09 crc kubenswrapper[4807]: I1127 11:19:09.212860 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-cbsmh" event={"ID":"8285736f-5d32-4503-94dd-f3e7c5d6a8f0","Type":"ContainerStarted","Data":"53c65c860b26e08bc249379478715ba80d98b6598845ee6270f6e5d5e363754d"} Nov 27 11:19:13 crc kubenswrapper[4807]: I1127 11:19:13.243724 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-czgf5" 
event={"ID":"bc3f48f0-2d12-4c07-bfb4-20914aeaf910","Type":"ContainerStarted","Data":"b03aec26022242e84b38c924903a3292a7c0b495fe7e76fc99dd2c28529aede4"} Nov 27 11:19:13 crc kubenswrapper[4807]: I1127 11:19:13.245670 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwhxv" event={"ID":"622a1ad1-bedf-4836-aa6e-0257f4694ae9","Type":"ContainerStarted","Data":"2cc56e258d45d5798b18aed6c518810eb3e52302df61e499f81c4200444c801e"} Nov 27 11:19:13 crc kubenswrapper[4807]: I1127 11:19:13.245785 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwhxv" Nov 27 11:19:13 crc kubenswrapper[4807]: I1127 11:19:13.246977 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-cbsmh" event={"ID":"8285736f-5d32-4503-94dd-f3e7c5d6a8f0","Type":"ContainerStarted","Data":"76e2946e0b08cb92542ca82466aed124f89b32c69c1b970fbeb7b3444bbcf459"} Nov 27 11:19:13 crc kubenswrapper[4807]: I1127 11:19:13.264520 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-czgf5" podStartSLOduration=2.2500873820000002 podStartE2EDuration="5.264497856s" podCreationTimestamp="2025-11-27 11:19:08 +0000 UTC" firstStartedPulling="2025-11-27 11:19:09.069019736 +0000 UTC m=+590.168517924" lastFinishedPulling="2025-11-27 11:19:12.0834302 +0000 UTC m=+593.182928398" observedRunningTime="2025-11-27 11:19:13.263048046 +0000 UTC m=+594.362546244" watchObservedRunningTime="2025-11-27 11:19:13.264497856 +0000 UTC m=+594.363996074" Nov 27 11:19:13 crc kubenswrapper[4807]: I1127 11:19:13.278882 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwhxv" podStartSLOduration=2.346663021 podStartE2EDuration="5.278863373s" podCreationTimestamp="2025-11-27 11:19:08 +0000 UTC" firstStartedPulling="2025-11-27 11:19:09.153031368 +0000 
UTC m=+590.252529566" lastFinishedPulling="2025-11-27 11:19:12.08523172 +0000 UTC m=+593.184729918" observedRunningTime="2025-11-27 11:19:13.278717759 +0000 UTC m=+594.378215967" watchObservedRunningTime="2025-11-27 11:19:13.278863373 +0000 UTC m=+594.378361561" Nov 27 11:19:13 crc kubenswrapper[4807]: I1127 11:19:13.295514 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-cbsmh" podStartSLOduration=2.265479087 podStartE2EDuration="5.295498183s" podCreationTimestamp="2025-11-27 11:19:08 +0000 UTC" firstStartedPulling="2025-11-27 11:19:09.098150581 +0000 UTC m=+590.197648779" lastFinishedPulling="2025-11-27 11:19:12.128169677 +0000 UTC m=+593.227667875" observedRunningTime="2025-11-27 11:19:13.293194339 +0000 UTC m=+594.392692547" watchObservedRunningTime="2025-11-27 11:19:13.295498183 +0000 UTC m=+594.394996381" Nov 27 11:19:18 crc kubenswrapper[4807]: I1127 11:19:18.844436 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-nwhxv" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.135965 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lwph9"] Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.136470 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovn-controller" containerID="cri-o://b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e" gracePeriod=30 Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.136609 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovn-acl-logging" containerID="cri-o://53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581" gracePeriod=30 
Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.136640 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="kube-rbac-proxy-node" containerID="cri-o://5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4" gracePeriod=30 Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.136728 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="sbdb" containerID="cri-o://e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3" gracePeriod=30 Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.136656 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416" gracePeriod=30 Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.136803 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="nbdb" containerID="cri-o://2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6" gracePeriod=30 Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.136901 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="northd" containerID="cri-o://9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103" gracePeriod=30 Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.204032 4807 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" containerID="cri-o://56ff2fa9700366e563601890e830cfce95805680d32c1d0bc0fa275c8cf55984" gracePeriod=30 Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.281880 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmngf_97f15cbb-220e-47db-b418-3a5aa4eb55a2/kube-multus/2.log" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.284927 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmngf_97f15cbb-220e-47db-b418-3a5aa4eb55a2/kube-multus/1.log" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.284964 4807 generic.go:334] "Generic (PLEG): container finished" podID="97f15cbb-220e-47db-b418-3a5aa4eb55a2" containerID="abc43243ac432a6c5ac5ce257d5f7461ab581a61f2fd55bf1613a430d20c13c4" exitCode=2 Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.285035 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmngf" event={"ID":"97f15cbb-220e-47db-b418-3a5aa4eb55a2","Type":"ContainerDied","Data":"abc43243ac432a6c5ac5ce257d5f7461ab581a61f2fd55bf1613a430d20c13c4"} Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.285073 4807 scope.go:117] "RemoveContainer" containerID="a509853063b75406f7fc467d6ab041935f8cee585f17f27c8618916093d4a624" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.285680 4807 scope.go:117] "RemoveContainer" containerID="abc43243ac432a6c5ac5ce257d5f7461ab581a61f2fd55bf1613a430d20c13c4" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.285889 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xmngf_openshift-multus(97f15cbb-220e-47db-b418-3a5aa4eb55a2)\"" pod="openshift-multus/multus-xmngf" podUID="97f15cbb-220e-47db-b418-3a5aa4eb55a2" Nov 27 11:19:19 
crc kubenswrapper[4807]: I1127 11:19:19.288163 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/3.log" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.290522 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovn-controller/0.log" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.290866 4807 generic.go:334] "Generic (PLEG): container finished" podID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerID="6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416" exitCode=0 Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.290895 4807 generic.go:334] "Generic (PLEG): container finished" podID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerID="5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4" exitCode=0 Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.290907 4807 generic.go:334] "Generic (PLEG): container finished" podID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerID="b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e" exitCode=143 Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.290931 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416"} Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.290962 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4"} Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.290972 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e"} Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.422497 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6 is running failed: container process not found" containerID="2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.422579 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3 is running failed: container process not found" containerID="e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.423447 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3 is running failed: container process not found" containerID="e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.423463 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6 is running failed: container process not found" containerID="2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.423741 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6 is running failed: container process not found" containerID="2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.423770 4807 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="nbdb" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.423886 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3 is running failed: container process not found" containerID="e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.423985 4807 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="sbdb" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.493556 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovnkube-controller/3.log" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.495810 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovn-acl-logging/0.log" Nov 27 11:19:19 crc kubenswrapper[4807]: 
I1127 11:19:19.496219 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lwph9_9c85b740-1df9-4ae7-a51b-fdfd89668d64/ovn-controller/0.log" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.496584 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.529703 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-openvswitch\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.529775 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-log-socket\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.529837 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-cni-bin\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.529871 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-ovn\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.529884 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-log-socket" 
(OuterVolumeSpecName: "log-socket") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.529893 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.529916 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-run-netns\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.529932 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.529945 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nmsn\" (UniqueName: \"kubernetes.io/projected/9c85b740-1df9-4ae7-a51b-fdfd89668d64-kube-api-access-7nmsn\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.529956 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.529952 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.529969 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-kubelet\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530021 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-cni-netd\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530065 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530083 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-node-log\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530096 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530111 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-var-lib-openvswitch\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530122 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-node-log" (OuterVolumeSpecName: "node-log") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530134 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-run-ovn-kubernetes\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530182 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530193 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530216 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530239 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-systemd\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530259 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530324 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovnkube-script-lib\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530358 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovnkube-config\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530380 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-systemd-units\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530444 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovn-node-metrics-cert\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530522 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-slash\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530545 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-etc-openvswitch\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.531750 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-env-overrides\") pod \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\" (UID: \"9c85b740-1df9-4ae7-a51b-fdfd89668d64\") " Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.532253 4807 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-node-log\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.532278 4807 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.532291 4807 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.532332 4807 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.532346 4807 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc 
kubenswrapper[4807]: I1127 11:19:19.532359 4807 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-log-socket\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.532370 4807 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.532406 4807 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.532420 4807 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.532431 4807 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.532441 4807 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530595 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530833 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530837 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-slash" (OuterVolumeSpecName: "host-slash") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.530973 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.531425 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.533011 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.535442 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c85b740-1df9-4ae7-a51b-fdfd89668d64-kube-api-access-7nmsn" (OuterVolumeSpecName: "kube-api-access-7nmsn") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "kube-api-access-7nmsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.537744 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550386 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nv9hp"] Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.550626 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovn-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550639 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovn-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.550653 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550660 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.550669 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550676 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.550687 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550694 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.550702 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" 
containerName="ovn-acl-logging" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550709 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovn-acl-logging" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.550723 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="kubecfg-setup" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550730 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="kubecfg-setup" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.550740 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="kube-rbac-proxy-node" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550747 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="kube-rbac-proxy-node" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.550758 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="northd" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550765 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="northd" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.550772 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="kube-rbac-proxy-ovn-metrics" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550780 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="kube-rbac-proxy-ovn-metrics" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.550791 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" 
containerName="sbdb" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550798 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="sbdb" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.550806 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="nbdb" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550812 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="nbdb" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.550823 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550829 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550934 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550944 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="northd" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550957 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="kube-rbac-proxy-node" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550966 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovn-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550975 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" 
containerName="kube-rbac-proxy-ovn-metrics" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550983 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="nbdb" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.550993 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovn-acl-logging" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.551005 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="sbdb" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.551016 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.551029 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: E1127 11:19:19.551136 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.551144 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.551258 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.551277 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" containerName="ovnkube-controller" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.553200 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.553454 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "9c85b740-1df9-4ae7-a51b-fdfd89668d64" (UID: "9c85b740-1df9-4ae7-a51b-fdfd89668d64"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.633910 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e936637-423f-4d36-848a-40517ca983fc-ovn-node-metrics-cert\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634192 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634222 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e936637-423f-4d36-848a-40517ca983fc-env-overrides\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634270 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-run-systemd\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634292 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e936637-423f-4d36-848a-40517ca983fc-ovnkube-config\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634320 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-kubelet\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634364 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-run-ovn-kubernetes\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634398 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-var-lib-openvswitch\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634422 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e936637-423f-4d36-848a-40517ca983fc-ovnkube-script-lib\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634464 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-cni-bin\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634486 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-log-socket\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634520 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2drw\" (UniqueName: \"kubernetes.io/projected/3e936637-423f-4d36-848a-40517ca983fc-kube-api-access-h2drw\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634542 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-slash\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634578 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-run-netns\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634601 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-run-openvswitch\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634621 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-systemd-units\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634648 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-run-ovn\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634661 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-node-log\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634675 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-etc-openvswitch\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634692 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-cni-netd\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634725 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nmsn\" (UniqueName: \"kubernetes.io/projected/9c85b740-1df9-4ae7-a51b-fdfd89668d64-kube-api-access-7nmsn\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634736 4807 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634745 4807 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634754 4807 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634764 4807 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634772 4807 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c85b740-1df9-4ae7-a51b-fdfd89668d64-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634780 4807 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-host-slash\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634788 4807 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c85b740-1df9-4ae7-a51b-fdfd89668d64-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.634797 4807 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c85b740-1df9-4ae7-a51b-fdfd89668d64-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736284 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-run-ovn\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736364 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-run-ovn\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736536 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-node-log\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736613 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-etc-openvswitch\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736638 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-cni-netd\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736671 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e936637-423f-4d36-848a-40517ca983fc-ovn-node-metrics-cert\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736722 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736755 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e936637-423f-4d36-848a-40517ca983fc-env-overrides\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736780 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-run-systemd\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736802 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e936637-423f-4d36-848a-40517ca983fc-ovnkube-config\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736808 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-cni-netd\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736826 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-kubelet\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736852 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736869 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-run-ovn-kubernetes\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736887 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-run-systemd\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736907 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-var-lib-openvswitch\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736926 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e936637-423f-4d36-848a-40517ca983fc-ovnkube-script-lib\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736946 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-cni-bin\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736962 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-log-socket\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736976 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2drw\" (UniqueName: \"kubernetes.io/projected/3e936637-423f-4d36-848a-40517ca983fc-kube-api-access-h2drw\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736991 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-slash\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737010 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-run-netns\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737023 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-run-openvswitch\") 
pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737043 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-systemd-units\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737098 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-systemd-units\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737123 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-kubelet\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737166 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-run-ovn-kubernetes\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.736869 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-etc-openvswitch\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737194 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-var-lib-openvswitch\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737429 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-cni-bin\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737458 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-run-netns\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737482 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-log-socket\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737509 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e936637-423f-4d36-848a-40517ca983fc-ovnkube-config\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737636 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e936637-423f-4d36-848a-40517ca983fc-env-overrides\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737687 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-run-openvswitch\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.737951 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e936637-423f-4d36-848a-40517ca983fc-ovnkube-script-lib\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.738439 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-host-slash\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.738607 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e936637-423f-4d36-848a-40517ca983fc-node-log\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.740660 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/3e936637-423f-4d36-848a-40517ca983fc-ovn-node-metrics-cert\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.753680 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2drw\" (UniqueName: \"kubernetes.io/projected/3e936637-423f-4d36-848a-40517ca983fc-kube-api-access-h2drw\") pod \"ovnkube-node-nv9hp\" (UID: \"3e936637-423f-4d36-848a-40517ca983fc\") " pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.806713 4807 scope.go:117] "RemoveContainer" containerID="a83ff78861d7505599d68f3547c22df618b019a7049a75f001984031a5f489a9" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.828422 4807 scope.go:117] "RemoveContainer" containerID="56ff2fa9700366e563601890e830cfce95805680d32c1d0bc0fa275c8cf55984" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.848584 4807 scope.go:117] "RemoveContainer" containerID="53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.863574 4807 scope.go:117] "RemoveContainer" containerID="dfce079411e27707c0e7f0150836430324e68ccb3b21a2e3a4c99c4ac9de7447" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.866492 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.894259 4807 scope.go:117] "RemoveContainer" containerID="2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6" Nov 27 11:19:19 crc kubenswrapper[4807]: W1127 11:19:19.897596 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e936637_423f_4d36_848a_40517ca983fc.slice/crio-5bf98c44dbe7e6a516100b92ccc431a029fb993b5e9aeafaaf2ee340c2075e87 WatchSource:0}: Error finding container 5bf98c44dbe7e6a516100b92ccc431a029fb993b5e9aeafaaf2ee340c2075e87: Status 404 returned error can't find the container with id 5bf98c44dbe7e6a516100b92ccc431a029fb993b5e9aeafaaf2ee340c2075e87 Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.908417 4807 scope.go:117] "RemoveContainer" containerID="9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.927985 4807 scope.go:117] "RemoveContainer" containerID="5172d216a4c724de7d0ae42dd761cc2f221bb816e67d94f54d3b01a69a4e7ad4" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.945120 4807 scope.go:117] "RemoveContainer" containerID="b32b340f26968b1d59c83401b7ba43f86e21198ea52084dbc9d7f7e10c434d2e" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.958727 4807 scope.go:117] "RemoveContainer" containerID="e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3" Nov 27 11:19:19 crc kubenswrapper[4807]: I1127 11:19:19.981280 4807 scope.go:117] "RemoveContainer" containerID="6459b8e45cbc982c35c430ae549227daee4e07f0ec0e31c36658de97eac04416" Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.299265 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmngf_97f15cbb-220e-47db-b418-3a5aa4eb55a2/kube-multus/2.log" Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.299351 4807 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"56ff2fa9700366e563601890e830cfce95805680d32c1d0bc0fa275c8cf55984"} Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.299378 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"e43fb9dfbe9764c53c4b3146cbc440e3cc6199bfcf1dd426ca636bd6bdbb5dc3"} Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.299388 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"2d857b9fb5a3f4014118adeb2e6a8e35e678e8cd774aabed830a7cf2d42d2df6"} Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.299397 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"9cd66332df5a6f85479d43b03d405a3ea953d728a37ec0665d629187faf7f103"} Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.299408 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"53d0d1807abeef56a9cd394920cd8a4abeea9f950c6501cb2c5009feb4e8d581"} Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.299418 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" event={"ID":"9c85b740-1df9-4ae7-a51b-fdfd89668d64","Type":"ContainerDied","Data":"0a5dff762776b30ad73f861bcc7e54e32ac6c0768a16a96458c906d041cee704"} Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.303483 4807 generic.go:334] "Generic (PLEG): container finished" podID="3e936637-423f-4d36-848a-40517ca983fc" 
containerID="2d0b0306f693b0138705c1d9b25811a34f8707be40747c5df4c90f2b42e2796f" exitCode=0 Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.303562 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" event={"ID":"3e936637-423f-4d36-848a-40517ca983fc","Type":"ContainerDied","Data":"2d0b0306f693b0138705c1d9b25811a34f8707be40747c5df4c90f2b42e2796f"} Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.303585 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lwph9" Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.303596 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" event={"ID":"3e936637-423f-4d36-848a-40517ca983fc","Type":"ContainerStarted","Data":"5bf98c44dbe7e6a516100b92ccc431a029fb993b5e9aeafaaf2ee340c2075e87"} Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.379363 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lwph9"] Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.383898 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lwph9"] Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.921170 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:19:20 crc kubenswrapper[4807]: I1127 11:19:20.921220 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Nov 27 11:19:21 crc kubenswrapper[4807]: I1127 11:19:21.312128 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" event={"ID":"3e936637-423f-4d36-848a-40517ca983fc","Type":"ContainerStarted","Data":"2c60e331712e572c8e37543f245a70dd72728388f5dadab079fc37e4a94ef819"} Nov 27 11:19:21 crc kubenswrapper[4807]: I1127 11:19:21.312507 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" event={"ID":"3e936637-423f-4d36-848a-40517ca983fc","Type":"ContainerStarted","Data":"489448936a03255fd701e4559460e1dfcca62031dd757a494d14b6f2c89685a4"} Nov 27 11:19:21 crc kubenswrapper[4807]: I1127 11:19:21.312523 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" event={"ID":"3e936637-423f-4d36-848a-40517ca983fc","Type":"ContainerStarted","Data":"3e257019dd9eb5632b84aa67a71c08c394a4269fba1dfa6396aba9c862a00f6d"} Nov 27 11:19:21 crc kubenswrapper[4807]: I1127 11:19:21.312539 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" event={"ID":"3e936637-423f-4d36-848a-40517ca983fc","Type":"ContainerStarted","Data":"794b95266eefa9e9a237fecf7a87b1376a9ec4717ead6c80749a880cc4bdada9"} Nov 27 11:19:21 crc kubenswrapper[4807]: I1127 11:19:21.312548 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" event={"ID":"3e936637-423f-4d36-848a-40517ca983fc","Type":"ContainerStarted","Data":"40a6301b592b3ba1cfcc506923a2c28477f9c968db331709f7e25717e943fdb1"} Nov 27 11:19:21 crc kubenswrapper[4807]: I1127 11:19:21.312555 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" event={"ID":"3e936637-423f-4d36-848a-40517ca983fc","Type":"ContainerStarted","Data":"3e36e2449488c2dc7be753ef5f362e2a073df4cdea9500d8afcafd33fa976400"} Nov 27 11:19:21 crc kubenswrapper[4807]: I1127 11:19:21.540973 4807 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c85b740-1df9-4ae7-a51b-fdfd89668d64" path="/var/lib/kubelet/pods/9c85b740-1df9-4ae7-a51b-fdfd89668d64/volumes" Nov 27 11:19:23 crc kubenswrapper[4807]: I1127 11:19:23.326405 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" event={"ID":"3e936637-423f-4d36-848a-40517ca983fc","Type":"ContainerStarted","Data":"2bc8656df00a49f603db56b2a1eb30d4a1b69ab4497b2364710cd902e823a419"} Nov 27 11:19:26 crc kubenswrapper[4807]: I1127 11:19:26.346949 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" event={"ID":"3e936637-423f-4d36-848a-40517ca983fc","Type":"ContainerStarted","Data":"e7953227f3da1fb044117e60c4518de202364781a3ff3df76681af842507191b"} Nov 27 11:19:26 crc kubenswrapper[4807]: I1127 11:19:26.347495 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:26 crc kubenswrapper[4807]: I1127 11:19:26.347512 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:26 crc kubenswrapper[4807]: I1127 11:19:26.390628 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:26 crc kubenswrapper[4807]: I1127 11:19:26.396307 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" podStartSLOduration=7.396289609 podStartE2EDuration="7.396289609s" podCreationTimestamp="2025-11-27 11:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:19:26.39161406 +0000 UTC m=+607.491112278" watchObservedRunningTime="2025-11-27 11:19:26.396289609 +0000 UTC m=+607.495787817" Nov 27 11:19:27 crc kubenswrapper[4807]: I1127 
11:19:27.352328 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:27 crc kubenswrapper[4807]: I1127 11:19:27.379676 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:30 crc kubenswrapper[4807]: I1127 11:19:30.532887 4807 scope.go:117] "RemoveContainer" containerID="abc43243ac432a6c5ac5ce257d5f7461ab581a61f2fd55bf1613a430d20c13c4" Nov 27 11:19:30 crc kubenswrapper[4807]: E1127 11:19:30.533975 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xmngf_openshift-multus(97f15cbb-220e-47db-b418-3a5aa4eb55a2)\"" pod="openshift-multus/multus-xmngf" podUID="97f15cbb-220e-47db-b418-3a5aa4eb55a2" Nov 27 11:19:42 crc kubenswrapper[4807]: I1127 11:19:42.532240 4807 scope.go:117] "RemoveContainer" containerID="abc43243ac432a6c5ac5ce257d5f7461ab581a61f2fd55bf1613a430d20c13c4" Nov 27 11:19:43 crc kubenswrapper[4807]: I1127 11:19:43.438002 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xmngf_97f15cbb-220e-47db-b418-3a5aa4eb55a2/kube-multus/2.log" Nov 27 11:19:43 crc kubenswrapper[4807]: I1127 11:19:43.438746 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xmngf" event={"ID":"97f15cbb-220e-47db-b418-3a5aa4eb55a2","Type":"ContainerStarted","Data":"3d1d021ea7c5a3e2e13b731dd69e1241ea1178a481beede7e5b34c9cb8905d25"} Nov 27 11:19:49 crc kubenswrapper[4807]: I1127 11:19:49.891019 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nv9hp" Nov 27 11:19:50 crc kubenswrapper[4807]: I1127 11:19:50.921935 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:19:50 crc kubenswrapper[4807]: I1127 11:19:50.922408 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:19:50 crc kubenswrapper[4807]: I1127 11:19:50.922473 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:19:50 crc kubenswrapper[4807]: I1127 11:19:50.924035 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"950bd2e48636b06df84e0002296e518c85524a06e8b0c7352cd93856ad7f71ef"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 11:19:50 crc kubenswrapper[4807]: I1127 11:19:50.924109 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://950bd2e48636b06df84e0002296e518c85524a06e8b0c7352cd93856ad7f71ef" gracePeriod=600 Nov 27 11:19:51 crc kubenswrapper[4807]: I1127 11:19:51.482179 4807 generic.go:334] "Generic (PLEG): container finished" podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerID="950bd2e48636b06df84e0002296e518c85524a06e8b0c7352cd93856ad7f71ef" exitCode=0 Nov 27 11:19:51 crc kubenswrapper[4807]: I1127 11:19:51.482273 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"950bd2e48636b06df84e0002296e518c85524a06e8b0c7352cd93856ad7f71ef"} Nov 27 11:19:51 crc kubenswrapper[4807]: I1127 11:19:51.482459 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"5c16a271e2f512c2b0b496bddb1e050219d71c041d2a908668448dc5280aeab0"} Nov 27 11:19:51 crc kubenswrapper[4807]: I1127 11:19:51.482479 4807 scope.go:117] "RemoveContainer" containerID="cc81507d52c9f1bfb0bb1ff2c6a207a6a959b377cb7154504d0530b5e35f12d5" Nov 27 11:19:57 crc kubenswrapper[4807]: I1127 11:19:57.924870 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr"] Nov 27 11:19:57 crc kubenswrapper[4807]: I1127 11:19:57.926534 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" Nov 27 11:19:57 crc kubenswrapper[4807]: I1127 11:19:57.928362 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 27 11:19:57 crc kubenswrapper[4807]: I1127 11:19:57.940476 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr"] Nov 27 11:19:58 crc kubenswrapper[4807]: I1127 11:19:58.041648 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11676299-32d9-41ed-92c6-7e3d55378519-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr\" (UID: \"11676299-32d9-41ed-92c6-7e3d55378519\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" Nov 27 11:19:58 crc kubenswrapper[4807]: I1127 11:19:58.041725 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11676299-32d9-41ed-92c6-7e3d55378519-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr\" (UID: \"11676299-32d9-41ed-92c6-7e3d55378519\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" Nov 27 11:19:58 crc kubenswrapper[4807]: I1127 11:19:58.041921 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29wbt\" (UniqueName: \"kubernetes.io/projected/11676299-32d9-41ed-92c6-7e3d55378519-kube-api-access-29wbt\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr\" (UID: \"11676299-32d9-41ed-92c6-7e3d55378519\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" Nov 27 11:19:58 crc kubenswrapper[4807]: 
I1127 11:19:58.143108 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29wbt\" (UniqueName: \"kubernetes.io/projected/11676299-32d9-41ed-92c6-7e3d55378519-kube-api-access-29wbt\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr\" (UID: \"11676299-32d9-41ed-92c6-7e3d55378519\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" Nov 27 11:19:58 crc kubenswrapper[4807]: I1127 11:19:58.143157 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11676299-32d9-41ed-92c6-7e3d55378519-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr\" (UID: \"11676299-32d9-41ed-92c6-7e3d55378519\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" Nov 27 11:19:58 crc kubenswrapper[4807]: I1127 11:19:58.143193 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11676299-32d9-41ed-92c6-7e3d55378519-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr\" (UID: \"11676299-32d9-41ed-92c6-7e3d55378519\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" Nov 27 11:19:58 crc kubenswrapper[4807]: I1127 11:19:58.143631 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11676299-32d9-41ed-92c6-7e3d55378519-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr\" (UID: \"11676299-32d9-41ed-92c6-7e3d55378519\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" Nov 27 11:19:58 crc kubenswrapper[4807]: I1127 11:19:58.143791 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/11676299-32d9-41ed-92c6-7e3d55378519-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr\" (UID: \"11676299-32d9-41ed-92c6-7e3d55378519\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" Nov 27 11:19:58 crc kubenswrapper[4807]: I1127 11:19:58.161556 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29wbt\" (UniqueName: \"kubernetes.io/projected/11676299-32d9-41ed-92c6-7e3d55378519-kube-api-access-29wbt\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr\" (UID: \"11676299-32d9-41ed-92c6-7e3d55378519\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" Nov 27 11:19:58 crc kubenswrapper[4807]: I1127 11:19:58.293050 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" Nov 27 11:19:58 crc kubenswrapper[4807]: I1127 11:19:58.507615 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr"] Nov 27 11:19:58 crc kubenswrapper[4807]: I1127 11:19:58.532440 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" event={"ID":"11676299-32d9-41ed-92c6-7e3d55378519","Type":"ContainerStarted","Data":"5a6a4fde1a59c0b4d0472b49c8ceac77f88193e9d6f2e193169766d04dc9183b"} Nov 27 11:19:59 crc kubenswrapper[4807]: I1127 11:19:59.542185 4807 generic.go:334] "Generic (PLEG): container finished" podID="11676299-32d9-41ed-92c6-7e3d55378519" containerID="272c8e6efecd4ba99c03870eac9c07aa872ef8b8647a4a65e8f43979a7733799" exitCode=0 Nov 27 11:19:59 crc kubenswrapper[4807]: I1127 11:19:59.553031 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" event={"ID":"11676299-32d9-41ed-92c6-7e3d55378519","Type":"ContainerDied","Data":"272c8e6efecd4ba99c03870eac9c07aa872ef8b8647a4a65e8f43979a7733799"} Nov 27 11:20:01 crc kubenswrapper[4807]: I1127 11:20:01.554128 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" event={"ID":"11676299-32d9-41ed-92c6-7e3d55378519","Type":"ContainerStarted","Data":"e7540ce3aa991fe7625aaedf6b2f07e42b8c514b57ceda041a17546dd4cf3672"} Nov 27 11:20:02 crc kubenswrapper[4807]: I1127 11:20:02.563854 4807 generic.go:334] "Generic (PLEG): container finished" podID="11676299-32d9-41ed-92c6-7e3d55378519" containerID="e7540ce3aa991fe7625aaedf6b2f07e42b8c514b57ceda041a17546dd4cf3672" exitCode=0 Nov 27 11:20:02 crc kubenswrapper[4807]: I1127 11:20:02.563897 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" event={"ID":"11676299-32d9-41ed-92c6-7e3d55378519","Type":"ContainerDied","Data":"e7540ce3aa991fe7625aaedf6b2f07e42b8c514b57ceda041a17546dd4cf3672"} Nov 27 11:20:03 crc kubenswrapper[4807]: I1127 11:20:03.575514 4807 generic.go:334] "Generic (PLEG): container finished" podID="11676299-32d9-41ed-92c6-7e3d55378519" containerID="cee8f4b3f35f2679ec47b70210ec08388899f111c10dac8465e4ee23dff60938" exitCode=0 Nov 27 11:20:03 crc kubenswrapper[4807]: I1127 11:20:03.575557 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" event={"ID":"11676299-32d9-41ed-92c6-7e3d55378519","Type":"ContainerDied","Data":"cee8f4b3f35f2679ec47b70210ec08388899f111c10dac8465e4ee23dff60938"} Nov 27 11:20:04 crc kubenswrapper[4807]: I1127 11:20:04.837656 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" Nov 27 11:20:04 crc kubenswrapper[4807]: I1127 11:20:04.966808 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11676299-32d9-41ed-92c6-7e3d55378519-bundle\") pod \"11676299-32d9-41ed-92c6-7e3d55378519\" (UID: \"11676299-32d9-41ed-92c6-7e3d55378519\") " Nov 27 11:20:04 crc kubenswrapper[4807]: I1127 11:20:04.966900 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11676299-32d9-41ed-92c6-7e3d55378519-util\") pod \"11676299-32d9-41ed-92c6-7e3d55378519\" (UID: \"11676299-32d9-41ed-92c6-7e3d55378519\") " Nov 27 11:20:04 crc kubenswrapper[4807]: I1127 11:20:04.966949 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29wbt\" (UniqueName: \"kubernetes.io/projected/11676299-32d9-41ed-92c6-7e3d55378519-kube-api-access-29wbt\") pod \"11676299-32d9-41ed-92c6-7e3d55378519\" (UID: \"11676299-32d9-41ed-92c6-7e3d55378519\") " Nov 27 11:20:04 crc kubenswrapper[4807]: I1127 11:20:04.968153 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11676299-32d9-41ed-92c6-7e3d55378519-bundle" (OuterVolumeSpecName: "bundle") pod "11676299-32d9-41ed-92c6-7e3d55378519" (UID: "11676299-32d9-41ed-92c6-7e3d55378519"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:20:04 crc kubenswrapper[4807]: I1127 11:20:04.968479 4807 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/11676299-32d9-41ed-92c6-7e3d55378519-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:20:04 crc kubenswrapper[4807]: I1127 11:20:04.975452 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11676299-32d9-41ed-92c6-7e3d55378519-kube-api-access-29wbt" (OuterVolumeSpecName: "kube-api-access-29wbt") pod "11676299-32d9-41ed-92c6-7e3d55378519" (UID: "11676299-32d9-41ed-92c6-7e3d55378519"). InnerVolumeSpecName "kube-api-access-29wbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:20:04 crc kubenswrapper[4807]: I1127 11:20:04.989856 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11676299-32d9-41ed-92c6-7e3d55378519-util" (OuterVolumeSpecName: "util") pod "11676299-32d9-41ed-92c6-7e3d55378519" (UID: "11676299-32d9-41ed-92c6-7e3d55378519"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:20:05 crc kubenswrapper[4807]: I1127 11:20:05.069718 4807 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/11676299-32d9-41ed-92c6-7e3d55378519-util\") on node \"crc\" DevicePath \"\"" Nov 27 11:20:05 crc kubenswrapper[4807]: I1127 11:20:05.069770 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29wbt\" (UniqueName: \"kubernetes.io/projected/11676299-32d9-41ed-92c6-7e3d55378519-kube-api-access-29wbt\") on node \"crc\" DevicePath \"\"" Nov 27 11:20:05 crc kubenswrapper[4807]: I1127 11:20:05.591064 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" Nov 27 11:20:05 crc kubenswrapper[4807]: I1127 11:20:05.590953 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr" event={"ID":"11676299-32d9-41ed-92c6-7e3d55378519","Type":"ContainerDied","Data":"5a6a4fde1a59c0b4d0472b49c8ceac77f88193e9d6f2e193169766d04dc9183b"} Nov 27 11:20:05 crc kubenswrapper[4807]: I1127 11:20:05.592008 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a6a4fde1a59c0b4d0472b49c8ceac77f88193e9d6f2e193169766d04dc9183b" Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.471800 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-6cwxp"] Nov 27 11:20:09 crc kubenswrapper[4807]: E1127 11:20:09.472524 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11676299-32d9-41ed-92c6-7e3d55378519" containerName="util" Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.472538 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="11676299-32d9-41ed-92c6-7e3d55378519" containerName="util" Nov 27 11:20:09 crc kubenswrapper[4807]: E1127 11:20:09.472549 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11676299-32d9-41ed-92c6-7e3d55378519" containerName="extract" Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.472556 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="11676299-32d9-41ed-92c6-7e3d55378519" containerName="extract" Nov 27 11:20:09 crc kubenswrapper[4807]: E1127 11:20:09.472573 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11676299-32d9-41ed-92c6-7e3d55378519" containerName="pull" Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.472579 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="11676299-32d9-41ed-92c6-7e3d55378519" containerName="pull" 
Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.472702 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="11676299-32d9-41ed-92c6-7e3d55378519" containerName="extract" Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.473094 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6cwxp" Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.479167 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.479509 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.483726 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-xrfkc" Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.521285 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clj89\" (UniqueName: \"kubernetes.io/projected/f99de4c6-acb8-40aa-8c9f-c450de947993-kube-api-access-clj89\") pod \"nmstate-operator-5b5b58f5c8-6cwxp\" (UID: \"f99de4c6-acb8-40aa-8c9f-c450de947993\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6cwxp" Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.549562 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-6cwxp"] Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.622565 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clj89\" (UniqueName: \"kubernetes.io/projected/f99de4c6-acb8-40aa-8c9f-c450de947993-kube-api-access-clj89\") pod \"nmstate-operator-5b5b58f5c8-6cwxp\" (UID: \"f99de4c6-acb8-40aa-8c9f-c450de947993\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6cwxp" Nov 27 11:20:09 crc 
kubenswrapper[4807]: I1127 11:20:09.650228 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clj89\" (UniqueName: \"kubernetes.io/projected/f99de4c6-acb8-40aa-8c9f-c450de947993-kube-api-access-clj89\") pod \"nmstate-operator-5b5b58f5c8-6cwxp\" (UID: \"f99de4c6-acb8-40aa-8c9f-c450de947993\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6cwxp" Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.790705 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6cwxp" Nov 27 11:20:09 crc kubenswrapper[4807]: I1127 11:20:09.977789 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-6cwxp"] Nov 27 11:20:09 crc kubenswrapper[4807]: W1127 11:20:09.984288 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf99de4c6_acb8_40aa_8c9f_c450de947993.slice/crio-82e11e8a02bec8400f333a283efb2eda9f85192b8e99f11f830144826a9e8121 WatchSource:0}: Error finding container 82e11e8a02bec8400f333a283efb2eda9f85192b8e99f11f830144826a9e8121: Status 404 returned error can't find the container with id 82e11e8a02bec8400f333a283efb2eda9f85192b8e99f11f830144826a9e8121 Nov 27 11:20:10 crc kubenswrapper[4807]: I1127 11:20:10.632145 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6cwxp" event={"ID":"f99de4c6-acb8-40aa-8c9f-c450de947993","Type":"ContainerStarted","Data":"82e11e8a02bec8400f333a283efb2eda9f85192b8e99f11f830144826a9e8121"} Nov 27 11:20:12 crc kubenswrapper[4807]: I1127 11:20:12.643774 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6cwxp" event={"ID":"f99de4c6-acb8-40aa-8c9f-c450de947993","Type":"ContainerStarted","Data":"d448b4884e0c4364778519b6bc1c4d55490329ea6131f4ded893d59077c0198a"} Nov 27 11:20:12 crc 
kubenswrapper[4807]: I1127 11:20:12.656976 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-6cwxp" podStartSLOduration=1.949123525 podStartE2EDuration="3.656955671s" podCreationTimestamp="2025-11-27 11:20:09 +0000 UTC" firstStartedPulling="2025-11-27 11:20:09.986834938 +0000 UTC m=+651.086333156" lastFinishedPulling="2025-11-27 11:20:11.694667104 +0000 UTC m=+652.794165302" observedRunningTime="2025-11-27 11:20:12.656221301 +0000 UTC m=+653.755719519" watchObservedRunningTime="2025-11-27 11:20:12.656955671 +0000 UTC m=+653.756453889" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.268942 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-rcrm6"] Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.270611 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-rcrm6" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.282438 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bxpnp" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.285960 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-rcrm6"] Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.292736 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m"] Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.294127 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.313129 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.316878 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-sm6qf"] Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.323568 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.329305 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/57fe27be-6097-4ef2-ac4a-2ff1625005a9-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-w7q6m\" (UID: \"57fe27be-6097-4ef2-ac4a-2ff1625005a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.329440 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptp8p\" (UniqueName: \"kubernetes.io/projected/18345e30-1bdc-47bf-8e02-16cf6c3f1bb1-kube-api-access-ptp8p\") pod \"nmstate-metrics-7f946cbc9-rcrm6\" (UID: \"18345e30-1bdc-47bf-8e02-16cf6c3f1bb1\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-rcrm6" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.329499 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khhll\" (UniqueName: \"kubernetes.io/projected/57fe27be-6097-4ef2-ac4a-2ff1625005a9-kube-api-access-khhll\") pod \"nmstate-webhook-5f6d4c5ccb-w7q6m\" (UID: \"57fe27be-6097-4ef2-ac4a-2ff1625005a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.353647 4807 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m"] Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.428929 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44"] Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.429581 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.430510 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60-dbus-socket\") pod \"nmstate-handler-sm6qf\" (UID: \"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60\") " pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.430546 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/57fe27be-6097-4ef2-ac4a-2ff1625005a9-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-w7q6m\" (UID: \"57fe27be-6097-4ef2-ac4a-2ff1625005a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.430575 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60-nmstate-lock\") pod \"nmstate-handler-sm6qf\" (UID: \"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60\") " pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.430594 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60-ovs-socket\") pod \"nmstate-handler-sm6qf\" (UID: 
\"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60\") " pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.430616 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptp8p\" (UniqueName: \"kubernetes.io/projected/18345e30-1bdc-47bf-8e02-16cf6c3f1bb1-kube-api-access-ptp8p\") pod \"nmstate-metrics-7f946cbc9-rcrm6\" (UID: \"18345e30-1bdc-47bf-8e02-16cf6c3f1bb1\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-rcrm6" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.430640 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khhll\" (UniqueName: \"kubernetes.io/projected/57fe27be-6097-4ef2-ac4a-2ff1625005a9-kube-api-access-khhll\") pod \"nmstate-webhook-5f6d4c5ccb-w7q6m\" (UID: \"57fe27be-6097-4ef2-ac4a-2ff1625005a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.430678 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llqx2\" (UniqueName: \"kubernetes.io/projected/8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60-kube-api-access-llqx2\") pod \"nmstate-handler-sm6qf\" (UID: \"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60\") " pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.433592 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-bmj9g" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.434401 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.438370 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.439349 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/57fe27be-6097-4ef2-ac4a-2ff1625005a9-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-w7q6m\" (UID: \"57fe27be-6097-4ef2-ac4a-2ff1625005a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.441885 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44"] Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.447937 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khhll\" (UniqueName: \"kubernetes.io/projected/57fe27be-6097-4ef2-ac4a-2ff1625005a9-kube-api-access-khhll\") pod \"nmstate-webhook-5f6d4c5ccb-w7q6m\" (UID: \"57fe27be-6097-4ef2-ac4a-2ff1625005a9\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.461613 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptp8p\" (UniqueName: \"kubernetes.io/projected/18345e30-1bdc-47bf-8e02-16cf6c3f1bb1-kube-api-access-ptp8p\") pod \"nmstate-metrics-7f946cbc9-rcrm6\" (UID: \"18345e30-1bdc-47bf-8e02-16cf6c3f1bb1\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-rcrm6" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.531353 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/902faa80-4625-495f-a0a9-b94bc50eae67-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-rvf44\" (UID: \"902faa80-4625-495f-a0a9-b94bc50eae67\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.531427 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llqx2\" (UniqueName: \"kubernetes.io/projected/8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60-kube-api-access-llqx2\") pod \"nmstate-handler-sm6qf\" (UID: 
\"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60\") " pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.531467 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60-dbus-socket\") pod \"nmstate-handler-sm6qf\" (UID: \"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60\") " pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.531491 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8zhm\" (UniqueName: \"kubernetes.io/projected/902faa80-4625-495f-a0a9-b94bc50eae67-kube-api-access-s8zhm\") pod \"nmstate-console-plugin-7fbb5f6569-rvf44\" (UID: \"902faa80-4625-495f-a0a9-b94bc50eae67\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.532781 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60-nmstate-lock\") pod \"nmstate-handler-sm6qf\" (UID: \"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60\") " pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.532850 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60-ovs-socket\") pod \"nmstate-handler-sm6qf\" (UID: \"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60\") " pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.532901 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/902faa80-4625-495f-a0a9-b94bc50eae67-plugin-serving-cert\") pod 
\"nmstate-console-plugin-7fbb5f6569-rvf44\" (UID: \"902faa80-4625-495f-a0a9-b94bc50eae67\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.532917 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60-nmstate-lock\") pod \"nmstate-handler-sm6qf\" (UID: \"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60\") " pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.532935 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60-ovs-socket\") pod \"nmstate-handler-sm6qf\" (UID: \"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60\") " pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.532969 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60-dbus-socket\") pod \"nmstate-handler-sm6qf\" (UID: \"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60\") " pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.553732 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llqx2\" (UniqueName: \"kubernetes.io/projected/8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60-kube-api-access-llqx2\") pod \"nmstate-handler-sm6qf\" (UID: \"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60\") " pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.594664 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-rcrm6" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.612222 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7577485bdf-dfplg"] Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.613070 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.628131 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.633881 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/902faa80-4625-495f-a0a9-b94bc50eae67-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-rvf44\" (UID: \"902faa80-4625-495f-a0a9-b94bc50eae67\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.633924 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/902faa80-4625-495f-a0a9-b94bc50eae67-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-rvf44\" (UID: \"902faa80-4625-495f-a0a9-b94bc50eae67\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.633995 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8zhm\" (UniqueName: \"kubernetes.io/projected/902faa80-4625-495f-a0a9-b94bc50eae67-kube-api-access-s8zhm\") pod \"nmstate-console-plugin-7fbb5f6569-rvf44\" (UID: \"902faa80-4625-495f-a0a9-b94bc50eae67\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.635002 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/902faa80-4625-495f-a0a9-b94bc50eae67-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-rvf44\" (UID: \"902faa80-4625-495f-a0a9-b94bc50eae67\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.644972 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/902faa80-4625-495f-a0a9-b94bc50eae67-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-rvf44\" (UID: \"902faa80-4625-495f-a0a9-b94bc50eae67\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.652195 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.656634 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8zhm\" (UniqueName: \"kubernetes.io/projected/902faa80-4625-495f-a0a9-b94bc50eae67-kube-api-access-s8zhm\") pod \"nmstate-console-plugin-7fbb5f6569-rvf44\" (UID: \"902faa80-4625-495f-a0a9-b94bc50eae67\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.674474 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7577485bdf-dfplg"] Nov 27 11:20:18 crc kubenswrapper[4807]: W1127 11:20:18.706779 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ab1549f_5eb1_4dee_be4b_c3ed2ce50f60.slice/crio-eb822b810452c1bbeedfff24a06fc4f9534403479f93a8c66e9da8571b5665cb WatchSource:0}: Error finding container eb822b810452c1bbeedfff24a06fc4f9534403479f93a8c66e9da8571b5665cb: Status 404 returned error can't find the container with id 
eb822b810452c1bbeedfff24a06fc4f9534403479f93a8c66e9da8571b5665cb Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.735530 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-oauth-serving-cert\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.735588 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-trusted-ca-bundle\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.735614 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-service-ca\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.735660 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-console-serving-cert\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.735680 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-console-oauth-config\") 
pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.735707 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knbcc\" (UniqueName: \"kubernetes.io/projected/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-kube-api-access-knbcc\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.735752 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-console-config\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.788890 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.837190 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-trusted-ca-bundle\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.837227 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-service-ca\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.837286 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-console-serving-cert\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.837302 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-console-oauth-config\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.837321 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-console-config\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") 
" pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.837336 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knbcc\" (UniqueName: \"kubernetes.io/projected/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-kube-api-access-knbcc\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.837385 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-oauth-serving-cert\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.838734 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-trusted-ca-bundle\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.839532 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-service-ca\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.842345 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-console-config\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 
11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.842817 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-console-oauth-config\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.840549 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-oauth-serving-cert\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.854757 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-console-serving-cert\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.855916 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knbcc\" (UniqueName: \"kubernetes.io/projected/4a57400b-af10-44e2-9c70-8b3d46a0bd6c-kube-api-access-knbcc\") pod \"console-7577485bdf-dfplg\" (UID: \"4a57400b-af10-44e2-9c70-8b3d46a0bd6c\") " pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.856898 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-rcrm6"] Nov 27 11:20:18 crc kubenswrapper[4807]: W1127 11:20:18.865435 4807 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18345e30_1bdc_47bf_8e02_16cf6c3f1bb1.slice/crio-362cfa977d43fee62cb6363e221fb4552f7c1bb12c22846295f4b6cc26067615 WatchSource:0}: Error finding container 362cfa977d43fee62cb6363e221fb4552f7c1bb12c22846295f4b6cc26067615: Status 404 returned error can't find the container with id 362cfa977d43fee62cb6363e221fb4552f7c1bb12c22846295f4b6cc26067615 Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.951137 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:18 crc kubenswrapper[4807]: I1127 11:20:18.974836 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44"] Nov 27 11:20:18 crc kubenswrapper[4807]: W1127 11:20:18.983020 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod902faa80_4625_495f_a0a9_b94bc50eae67.slice/crio-f79a5538344beb20ccb0a9473510700ede3d86fe8b3c4ee2c2e3995ac720c827 WatchSource:0}: Error finding container f79a5538344beb20ccb0a9473510700ede3d86fe8b3c4ee2c2e3995ac720c827: Status 404 returned error can't find the container with id f79a5538344beb20ccb0a9473510700ede3d86fe8b3c4ee2c2e3995ac720c827 Nov 27 11:20:19 crc kubenswrapper[4807]: I1127 11:20:19.119400 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7577485bdf-dfplg"] Nov 27 11:20:19 crc kubenswrapper[4807]: I1127 11:20:19.124762 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m"] Nov 27 11:20:19 crc kubenswrapper[4807]: W1127 11:20:19.127000 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57fe27be_6097_4ef2_ac4a_2ff1625005a9.slice/crio-8b895dcc148f39de9d762dea35c0ea8d98beb5eecde39e470a46757166db87e8 WatchSource:0}: Error 
finding container 8b895dcc148f39de9d762dea35c0ea8d98beb5eecde39e470a46757166db87e8: Status 404 returned error can't find the container with id 8b895dcc148f39de9d762dea35c0ea8d98beb5eecde39e470a46757166db87e8 Nov 27 11:20:19 crc kubenswrapper[4807]: W1127 11:20:19.128729 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a57400b_af10_44e2_9c70_8b3d46a0bd6c.slice/crio-f46e90dba1b76773d9a04c51ab2422ec37bd89546ef5ba52a8ed4186e3849ab3 WatchSource:0}: Error finding container f46e90dba1b76773d9a04c51ab2422ec37bd89546ef5ba52a8ed4186e3849ab3: Status 404 returned error can't find the container with id f46e90dba1b76773d9a04c51ab2422ec37bd89546ef5ba52a8ed4186e3849ab3 Nov 27 11:20:19 crc kubenswrapper[4807]: I1127 11:20:19.695293 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7577485bdf-dfplg" event={"ID":"4a57400b-af10-44e2-9c70-8b3d46a0bd6c","Type":"ContainerStarted","Data":"dfb74387d6686456cb15f5a0c3d77c00647b407c3822ee0be5b38c6b31c3e609"} Nov 27 11:20:19 crc kubenswrapper[4807]: I1127 11:20:19.695738 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7577485bdf-dfplg" event={"ID":"4a57400b-af10-44e2-9c70-8b3d46a0bd6c","Type":"ContainerStarted","Data":"f46e90dba1b76773d9a04c51ab2422ec37bd89546ef5ba52a8ed4186e3849ab3"} Nov 27 11:20:19 crc kubenswrapper[4807]: I1127 11:20:19.698140 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-rcrm6" event={"ID":"18345e30-1bdc-47bf-8e02-16cf6c3f1bb1","Type":"ContainerStarted","Data":"362cfa977d43fee62cb6363e221fb4552f7c1bb12c22846295f4b6cc26067615"} Nov 27 11:20:19 crc kubenswrapper[4807]: I1127 11:20:19.700731 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m" 
event={"ID":"57fe27be-6097-4ef2-ac4a-2ff1625005a9","Type":"ContainerStarted","Data":"8b895dcc148f39de9d762dea35c0ea8d98beb5eecde39e470a46757166db87e8"} Nov 27 11:20:19 crc kubenswrapper[4807]: I1127 11:20:19.702289 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" event={"ID":"902faa80-4625-495f-a0a9-b94bc50eae67","Type":"ContainerStarted","Data":"f79a5538344beb20ccb0a9473510700ede3d86fe8b3c4ee2c2e3995ac720c827"} Nov 27 11:20:19 crc kubenswrapper[4807]: I1127 11:20:19.703526 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sm6qf" event={"ID":"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60","Type":"ContainerStarted","Data":"eb822b810452c1bbeedfff24a06fc4f9534403479f93a8c66e9da8571b5665cb"} Nov 27 11:20:19 crc kubenswrapper[4807]: I1127 11:20:19.727348 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7577485bdf-dfplg" podStartSLOduration=1.7273169990000001 podStartE2EDuration="1.727316999s" podCreationTimestamp="2025-11-27 11:20:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:20:19.718765539 +0000 UTC m=+660.818263797" watchObservedRunningTime="2025-11-27 11:20:19.727316999 +0000 UTC m=+660.826815237" Nov 27 11:20:22 crc kubenswrapper[4807]: I1127 11:20:22.727294 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-sm6qf" event={"ID":"8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60","Type":"ContainerStarted","Data":"2e7ba0b36f32ec6033431e650f85f920ebcddd65f27ffc46bd31f05b038babac"} Nov 27 11:20:22 crc kubenswrapper[4807]: I1127 11:20:22.728050 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:22 crc kubenswrapper[4807]: I1127 11:20:22.729783 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-7f946cbc9-rcrm6" event={"ID":"18345e30-1bdc-47bf-8e02-16cf6c3f1bb1","Type":"ContainerStarted","Data":"267e87ebabd636858137c39228c7766bd721649f35834f76ccd699b45533e050"} Nov 27 11:20:22 crc kubenswrapper[4807]: I1127 11:20:22.731171 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m" event={"ID":"57fe27be-6097-4ef2-ac4a-2ff1625005a9","Type":"ContainerStarted","Data":"a0d2ae099c09492738d242de02a085b1d5bae37bac2c7d1f4d4ca8156cdb6a86"} Nov 27 11:20:22 crc kubenswrapper[4807]: I1127 11:20:22.731822 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m" Nov 27 11:20:22 crc kubenswrapper[4807]: I1127 11:20:22.732786 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" event={"ID":"902faa80-4625-495f-a0a9-b94bc50eae67","Type":"ContainerStarted","Data":"4830176d9f576959d08cadab6a2d0f0c5bcc76e66a37a7d6b488b137c32c35b4"} Nov 27 11:20:22 crc kubenswrapper[4807]: I1127 11:20:22.743459 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-sm6qf" podStartSLOduration=1.092366923 podStartE2EDuration="4.743446453s" podCreationTimestamp="2025-11-27 11:20:18 +0000 UTC" firstStartedPulling="2025-11-27 11:20:18.729924861 +0000 UTC m=+659.829423059" lastFinishedPulling="2025-11-27 11:20:22.381004401 +0000 UTC m=+663.480502589" observedRunningTime="2025-11-27 11:20:22.740958747 +0000 UTC m=+663.840456945" watchObservedRunningTime="2025-11-27 11:20:22.743446453 +0000 UTC m=+663.842944641" Nov 27 11:20:22 crc kubenswrapper[4807]: I1127 11:20:22.761538 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-rvf44" podStartSLOduration=1.392312521 podStartE2EDuration="4.761511457s" podCreationTimestamp="2025-11-27 11:20:18 +0000 
UTC" firstStartedPulling="2025-11-27 11:20:18.985059908 +0000 UTC m=+660.084558146" lastFinishedPulling="2025-11-27 11:20:22.354258884 +0000 UTC m=+663.453757082" observedRunningTime="2025-11-27 11:20:22.758186338 +0000 UTC m=+663.857684536" watchObservedRunningTime="2025-11-27 11:20:22.761511457 +0000 UTC m=+663.861009665" Nov 27 11:20:22 crc kubenswrapper[4807]: I1127 11:20:22.803477 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m" podStartSLOduration=1.5392733490000001 podStartE2EDuration="4.803458231s" podCreationTimestamp="2025-11-27 11:20:18 +0000 UTC" firstStartedPulling="2025-11-27 11:20:19.129153739 +0000 UTC m=+660.228651947" lastFinishedPulling="2025-11-27 11:20:22.393338631 +0000 UTC m=+663.492836829" observedRunningTime="2025-11-27 11:20:22.801974372 +0000 UTC m=+663.901472590" watchObservedRunningTime="2025-11-27 11:20:22.803458231 +0000 UTC m=+663.902956429" Nov 27 11:20:25 crc kubenswrapper[4807]: I1127 11:20:25.757120 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-rcrm6" event={"ID":"18345e30-1bdc-47bf-8e02-16cf6c3f1bb1","Type":"ContainerStarted","Data":"56550a35d3e6ebbf4bd98b76cafadcc7ead501e8a95a9fc6517e9b3531a62de2"} Nov 27 11:20:25 crc kubenswrapper[4807]: I1127 11:20:25.778554 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-rcrm6" podStartSLOduration=1.67588599 podStartE2EDuration="7.778535896s" podCreationTimestamp="2025-11-27 11:20:18 +0000 UTC" firstStartedPulling="2025-11-27 11:20:18.873724744 +0000 UTC m=+659.973222942" lastFinishedPulling="2025-11-27 11:20:24.97637464 +0000 UTC m=+666.075872848" observedRunningTime="2025-11-27 11:20:25.775434803 +0000 UTC m=+666.874933011" watchObservedRunningTime="2025-11-27 11:20:25.778535896 +0000 UTC m=+666.878034094" Nov 27 11:20:28 crc kubenswrapper[4807]: I1127 11:20:28.679924 4807 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-sm6qf" Nov 27 11:20:28 crc kubenswrapper[4807]: I1127 11:20:28.952153 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:28 crc kubenswrapper[4807]: I1127 11:20:28.952589 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:28 crc kubenswrapper[4807]: I1127 11:20:28.961315 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:29 crc kubenswrapper[4807]: I1127 11:20:29.795458 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7577485bdf-dfplg" Nov 27 11:20:29 crc kubenswrapper[4807]: I1127 11:20:29.852684 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jdsqc"] Nov 27 11:20:38 crc kubenswrapper[4807]: I1127 11:20:38.636601 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-w7q6m" Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.288369 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw"] Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.291066 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.293812 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.297606 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw"] Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.430002 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9cf2b1d-920e-4c71-8180-3a944a6b745a-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw\" (UID: \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.430349 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97mzc\" (UniqueName: \"kubernetes.io/projected/f9cf2b1d-920e-4c71-8180-3a944a6b745a-kube-api-access-97mzc\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw\" (UID: \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.430617 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9cf2b1d-920e-4c71-8180-3a944a6b745a-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw\" (UID: \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" Nov 27 11:20:54 crc kubenswrapper[4807]: 
I1127 11:20:54.531875 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9cf2b1d-920e-4c71-8180-3a944a6b745a-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw\" (UID: \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.531981 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9cf2b1d-920e-4c71-8180-3a944a6b745a-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw\" (UID: \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.532530 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9cf2b1d-920e-4c71-8180-3a944a6b745a-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw\" (UID: \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.532574 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97mzc\" (UniqueName: \"kubernetes.io/projected/f9cf2b1d-920e-4c71-8180-3a944a6b745a-kube-api-access-97mzc\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw\" (UID: \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.532703 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f9cf2b1d-920e-4c71-8180-3a944a6b745a-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw\" (UID: \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.553673 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97mzc\" (UniqueName: \"kubernetes.io/projected/f9cf2b1d-920e-4c71-8180-3a944a6b745a-kube-api-access-97mzc\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw\" (UID: \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.614227 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.833280 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw"] Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.895719 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-jdsqc" podUID="93c49e07-08ef-4b31-abb3-787a46a3fbfd" containerName="console" containerID="cri-o://1507db743a51e61ef8a4e33b3bdaea25a7f492578649121d9f24d7e6a2094899" gracePeriod=15 Nov 27 11:20:54 crc kubenswrapper[4807]: I1127 11:20:54.956235 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" event={"ID":"f9cf2b1d-920e-4c71-8180-3a944a6b745a","Type":"ContainerStarted","Data":"f1fa48c2fdeeaa59fa85f12c74748904c23bc3efcdd2e231e0b80e3de59be921"} Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.209592 4807 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jdsqc_93c49e07-08ef-4b31-abb3-787a46a3fbfd/console/0.log" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.209663 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.342324 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-oauth-serving-cert\") pod \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.342394 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl6sp\" (UniqueName: \"kubernetes.io/projected/93c49e07-08ef-4b31-abb3-787a46a3fbfd-kube-api-access-gl6sp\") pod \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.342432 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-oauth-config\") pod \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.342481 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-service-ca\") pod \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.342530 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-config\") pod \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.342598 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-serving-cert\") pod \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.343131 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "93c49e07-08ef-4b31-abb3-787a46a3fbfd" (UID: "93c49e07-08ef-4b31-abb3-787a46a3fbfd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.343141 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-config" (OuterVolumeSpecName: "console-config") pod "93c49e07-08ef-4b31-abb3-787a46a3fbfd" (UID: "93c49e07-08ef-4b31-abb3-787a46a3fbfd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.343148 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "93c49e07-08ef-4b31-abb3-787a46a3fbfd" (UID: "93c49e07-08ef-4b31-abb3-787a46a3fbfd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.343167 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-trusted-ca-bundle\") pod \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\" (UID: \"93c49e07-08ef-4b31-abb3-787a46a3fbfd\") " Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.343687 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-service-ca" (OuterVolumeSpecName: "service-ca") pod "93c49e07-08ef-4b31-abb3-787a46a3fbfd" (UID: "93c49e07-08ef-4b31-abb3-787a46a3fbfd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.343701 4807 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.343729 4807 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.343743 4807 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.347603 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "93c49e07-08ef-4b31-abb3-787a46a3fbfd" (UID: 
"93c49e07-08ef-4b31-abb3-787a46a3fbfd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.347653 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93c49e07-08ef-4b31-abb3-787a46a3fbfd-kube-api-access-gl6sp" (OuterVolumeSpecName: "kube-api-access-gl6sp") pod "93c49e07-08ef-4b31-abb3-787a46a3fbfd" (UID: "93c49e07-08ef-4b31-abb3-787a46a3fbfd"). InnerVolumeSpecName "kube-api-access-gl6sp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.347795 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "93c49e07-08ef-4b31-abb3-787a46a3fbfd" (UID: "93c49e07-08ef-4b31-abb3-787a46a3fbfd"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.444388 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl6sp\" (UniqueName: \"kubernetes.io/projected/93c49e07-08ef-4b31-abb3-787a46a3fbfd-kube-api-access-gl6sp\") on node \"crc\" DevicePath \"\"" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.444430 4807 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.444443 4807 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/93c49e07-08ef-4b31-abb3-787a46a3fbfd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.444454 4807 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/93c49e07-08ef-4b31-abb3-787a46a3fbfd-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.967240 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jdsqc_93c49e07-08ef-4b31-abb3-787a46a3fbfd/console/0.log" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.967359 4807 generic.go:334] "Generic (PLEG): container finished" podID="93c49e07-08ef-4b31-abb3-787a46a3fbfd" containerID="1507db743a51e61ef8a4e33b3bdaea25a7f492578649121d9f24d7e6a2094899" exitCode=2 Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.967480 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jdsqc" event={"ID":"93c49e07-08ef-4b31-abb3-787a46a3fbfd","Type":"ContainerDied","Data":"1507db743a51e61ef8a4e33b3bdaea25a7f492578649121d9f24d7e6a2094899"} Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 
11:20:55.967502 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jdsqc" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.967518 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jdsqc" event={"ID":"93c49e07-08ef-4b31-abb3-787a46a3fbfd","Type":"ContainerDied","Data":"39a17978965253cb8ff44fcd2baff1b69f2372f7dbd0460d6e9701e781029116"} Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.967546 4807 scope.go:117] "RemoveContainer" containerID="1507db743a51e61ef8a4e33b3bdaea25a7f492578649121d9f24d7e6a2094899" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.970707 4807 generic.go:334] "Generic (PLEG): container finished" podID="f9cf2b1d-920e-4c71-8180-3a944a6b745a" containerID="35863e5d3c60c62094054e9ed1b1f2d7a7cf05404005fd9d99ed61fb0c00e101" exitCode=0 Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.970759 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" event={"ID":"f9cf2b1d-920e-4c71-8180-3a944a6b745a","Type":"ContainerDied","Data":"35863e5d3c60c62094054e9ed1b1f2d7a7cf05404005fd9d99ed61fb0c00e101"} Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.987616 4807 scope.go:117] "RemoveContainer" containerID="1507db743a51e61ef8a4e33b3bdaea25a7f492578649121d9f24d7e6a2094899" Nov 27 11:20:55 crc kubenswrapper[4807]: E1127 11:20:55.988001 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1507db743a51e61ef8a4e33b3bdaea25a7f492578649121d9f24d7e6a2094899\": container with ID starting with 1507db743a51e61ef8a4e33b3bdaea25a7f492578649121d9f24d7e6a2094899 not found: ID does not exist" containerID="1507db743a51e61ef8a4e33b3bdaea25a7f492578649121d9f24d7e6a2094899" Nov 27 11:20:55 crc kubenswrapper[4807]: I1127 11:20:55.988064 4807 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1507db743a51e61ef8a4e33b3bdaea25a7f492578649121d9f24d7e6a2094899"} err="failed to get container status \"1507db743a51e61ef8a4e33b3bdaea25a7f492578649121d9f24d7e6a2094899\": rpc error: code = NotFound desc = could not find container \"1507db743a51e61ef8a4e33b3bdaea25a7f492578649121d9f24d7e6a2094899\": container with ID starting with 1507db743a51e61ef8a4e33b3bdaea25a7f492578649121d9f24d7e6a2094899 not found: ID does not exist" Nov 27 11:20:56 crc kubenswrapper[4807]: I1127 11:20:56.016319 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jdsqc"] Nov 27 11:20:56 crc kubenswrapper[4807]: I1127 11:20:56.026300 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-jdsqc"] Nov 27 11:20:57 crc kubenswrapper[4807]: I1127 11:20:57.544340 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93c49e07-08ef-4b31-abb3-787a46a3fbfd" path="/var/lib/kubelet/pods/93c49e07-08ef-4b31-abb3-787a46a3fbfd/volumes" Nov 27 11:20:57 crc kubenswrapper[4807]: I1127 11:20:57.988095 4807 generic.go:334] "Generic (PLEG): container finished" podID="f9cf2b1d-920e-4c71-8180-3a944a6b745a" containerID="a4593ca99b696f21e7c6c629721f285d490a0933336511fe563c85f628b8a437" exitCode=0 Nov 27 11:20:57 crc kubenswrapper[4807]: I1127 11:20:57.988133 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" event={"ID":"f9cf2b1d-920e-4c71-8180-3a944a6b745a","Type":"ContainerDied","Data":"a4593ca99b696f21e7c6c629721f285d490a0933336511fe563c85f628b8a437"} Nov 27 11:20:59 crc kubenswrapper[4807]: I1127 11:20:59.002508 4807 generic.go:334] "Generic (PLEG): container finished" podID="f9cf2b1d-920e-4c71-8180-3a944a6b745a" containerID="0c715ab394213863084277cc898fc152b52ef4426a153dbc2ef2cfb2cfbe727d" exitCode=0 Nov 27 11:20:59 crc kubenswrapper[4807]: I1127 
11:20:59.002585 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" event={"ID":"f9cf2b1d-920e-4c71-8180-3a944a6b745a","Type":"ContainerDied","Data":"0c715ab394213863084277cc898fc152b52ef4426a153dbc2ef2cfb2cfbe727d"} Nov 27 11:21:00 crc kubenswrapper[4807]: I1127 11:21:00.367792 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" Nov 27 11:21:00 crc kubenswrapper[4807]: I1127 11:21:00.417257 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9cf2b1d-920e-4c71-8180-3a944a6b745a-bundle\") pod \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\" (UID: \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\") " Nov 27 11:21:00 crc kubenswrapper[4807]: I1127 11:21:00.417338 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97mzc\" (UniqueName: \"kubernetes.io/projected/f9cf2b1d-920e-4c71-8180-3a944a6b745a-kube-api-access-97mzc\") pod \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\" (UID: \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\") " Nov 27 11:21:00 crc kubenswrapper[4807]: I1127 11:21:00.417387 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9cf2b1d-920e-4c71-8180-3a944a6b745a-util\") pod \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\" (UID: \"f9cf2b1d-920e-4c71-8180-3a944a6b745a\") " Nov 27 11:21:00 crc kubenswrapper[4807]: I1127 11:21:00.433825 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9cf2b1d-920e-4c71-8180-3a944a6b745a-bundle" (OuterVolumeSpecName: "bundle") pod "f9cf2b1d-920e-4c71-8180-3a944a6b745a" (UID: "f9cf2b1d-920e-4c71-8180-3a944a6b745a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:21:00 crc kubenswrapper[4807]: I1127 11:21:00.447536 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9cf2b1d-920e-4c71-8180-3a944a6b745a-kube-api-access-97mzc" (OuterVolumeSpecName: "kube-api-access-97mzc") pod "f9cf2b1d-920e-4c71-8180-3a944a6b745a" (UID: "f9cf2b1d-920e-4c71-8180-3a944a6b745a"). InnerVolumeSpecName "kube-api-access-97mzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:21:00 crc kubenswrapper[4807]: I1127 11:21:00.518998 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97mzc\" (UniqueName: \"kubernetes.io/projected/f9cf2b1d-920e-4c71-8180-3a944a6b745a-kube-api-access-97mzc\") on node \"crc\" DevicePath \"\"" Nov 27 11:21:00 crc kubenswrapper[4807]: I1127 11:21:00.519034 4807 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9cf2b1d-920e-4c71-8180-3a944a6b745a-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:21:00 crc kubenswrapper[4807]: I1127 11:21:00.611591 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9cf2b1d-920e-4c71-8180-3a944a6b745a-util" (OuterVolumeSpecName: "util") pod "f9cf2b1d-920e-4c71-8180-3a944a6b745a" (UID: "f9cf2b1d-920e-4c71-8180-3a944a6b745a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:21:00 crc kubenswrapper[4807]: I1127 11:21:00.619770 4807 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9cf2b1d-920e-4c71-8180-3a944a6b745a-util\") on node \"crc\" DevicePath \"\"" Nov 27 11:21:01 crc kubenswrapper[4807]: I1127 11:21:01.020622 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" event={"ID":"f9cf2b1d-920e-4c71-8180-3a944a6b745a","Type":"ContainerDied","Data":"f1fa48c2fdeeaa59fa85f12c74748904c23bc3efcdd2e231e0b80e3de59be921"} Nov 27 11:21:01 crc kubenswrapper[4807]: I1127 11:21:01.020885 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw" Nov 27 11:21:01 crc kubenswrapper[4807]: I1127 11:21:01.021293 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1fa48c2fdeeaa59fa85f12c74748904c23bc3efcdd2e231e0b80e3de59be921" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.601137 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn"] Nov 27 11:21:09 crc kubenswrapper[4807]: E1127 11:21:09.601875 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c49e07-08ef-4b31-abb3-787a46a3fbfd" containerName="console" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.601889 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c49e07-08ef-4b31-abb3-787a46a3fbfd" containerName="console" Nov 27 11:21:09 crc kubenswrapper[4807]: E1127 11:21:09.601901 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9cf2b1d-920e-4c71-8180-3a944a6b745a" containerName="extract" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.601906 4807 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f9cf2b1d-920e-4c71-8180-3a944a6b745a" containerName="extract" Nov 27 11:21:09 crc kubenswrapper[4807]: E1127 11:21:09.601921 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9cf2b1d-920e-4c71-8180-3a944a6b745a" containerName="util" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.601926 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9cf2b1d-920e-4c71-8180-3a944a6b745a" containerName="util" Nov 27 11:21:09 crc kubenswrapper[4807]: E1127 11:21:09.601939 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9cf2b1d-920e-4c71-8180-3a944a6b745a" containerName="pull" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.601945 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9cf2b1d-920e-4c71-8180-3a944a6b745a" containerName="pull" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.602038 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c49e07-08ef-4b31-abb3-787a46a3fbfd" containerName="console" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.602046 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9cf2b1d-920e-4c71-8180-3a944a6b745a" containerName="extract" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.602447 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.604592 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.604622 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.605687 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.607101 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-466c2" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.608364 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.626869 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn"] Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.728116 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd79b824-f426-4793-b5e4-b351642047f5-webhook-cert\") pod \"metallb-operator-controller-manager-7dd7bd5d6c-lvbpn\" (UID: \"fd79b824-f426-4793-b5e4-b351642047f5\") " pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.728180 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gw2x\" (UniqueName: \"kubernetes.io/projected/fd79b824-f426-4793-b5e4-b351642047f5-kube-api-access-2gw2x\") pod 
\"metallb-operator-controller-manager-7dd7bd5d6c-lvbpn\" (UID: \"fd79b824-f426-4793-b5e4-b351642047f5\") " pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.728273 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd79b824-f426-4793-b5e4-b351642047f5-apiservice-cert\") pod \"metallb-operator-controller-manager-7dd7bd5d6c-lvbpn\" (UID: \"fd79b824-f426-4793-b5e4-b351642047f5\") " pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.829515 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd79b824-f426-4793-b5e4-b351642047f5-webhook-cert\") pod \"metallb-operator-controller-manager-7dd7bd5d6c-lvbpn\" (UID: \"fd79b824-f426-4793-b5e4-b351642047f5\") " pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.829924 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gw2x\" (UniqueName: \"kubernetes.io/projected/fd79b824-f426-4793-b5e4-b351642047f5-kube-api-access-2gw2x\") pod \"metallb-operator-controller-manager-7dd7bd5d6c-lvbpn\" (UID: \"fd79b824-f426-4793-b5e4-b351642047f5\") " pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.830030 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd79b824-f426-4793-b5e4-b351642047f5-apiservice-cert\") pod \"metallb-operator-controller-manager-7dd7bd5d6c-lvbpn\" (UID: \"fd79b824-f426-4793-b5e4-b351642047f5\") " pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" Nov 27 11:21:09 crc 
kubenswrapper[4807]: I1127 11:21:09.835747 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd79b824-f426-4793-b5e4-b351642047f5-webhook-cert\") pod \"metallb-operator-controller-manager-7dd7bd5d6c-lvbpn\" (UID: \"fd79b824-f426-4793-b5e4-b351642047f5\") " pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.836445 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd79b824-f426-4793-b5e4-b351642047f5-apiservice-cert\") pod \"metallb-operator-controller-manager-7dd7bd5d6c-lvbpn\" (UID: \"fd79b824-f426-4793-b5e4-b351642047f5\") " pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.850455 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gw2x\" (UniqueName: \"kubernetes.io/projected/fd79b824-f426-4793-b5e4-b351642047f5-kube-api-access-2gw2x\") pod \"metallb-operator-controller-manager-7dd7bd5d6c-lvbpn\" (UID: \"fd79b824-f426-4793-b5e4-b351642047f5\") " pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.918350 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.949267 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q"] Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.949909 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.957389 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.957876 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.958049 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bwnw2" Nov 27 11:21:09 crc kubenswrapper[4807]: I1127 11:21:09.967778 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q"] Nov 27 11:21:10 crc kubenswrapper[4807]: I1127 11:21:10.133425 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dfbac2a4-3c47-4ac8-8643-c322886121d4-webhook-cert\") pod \"metallb-operator-webhook-server-64bbfd4bf8-wlg8q\" (UID: \"dfbac2a4-3c47-4ac8-8643-c322886121d4\") " pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" Nov 27 11:21:10 crc kubenswrapper[4807]: I1127 11:21:10.133721 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dfbac2a4-3c47-4ac8-8643-c322886121d4-apiservice-cert\") pod \"metallb-operator-webhook-server-64bbfd4bf8-wlg8q\" (UID: \"dfbac2a4-3c47-4ac8-8643-c322886121d4\") " pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" Nov 27 11:21:10 crc kubenswrapper[4807]: I1127 11:21:10.133755 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdslm\" (UniqueName: 
\"kubernetes.io/projected/dfbac2a4-3c47-4ac8-8643-c322886121d4-kube-api-access-bdslm\") pod \"metallb-operator-webhook-server-64bbfd4bf8-wlg8q\" (UID: \"dfbac2a4-3c47-4ac8-8643-c322886121d4\") " pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" Nov 27 11:21:10 crc kubenswrapper[4807]: I1127 11:21:10.235371 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dfbac2a4-3c47-4ac8-8643-c322886121d4-apiservice-cert\") pod \"metallb-operator-webhook-server-64bbfd4bf8-wlg8q\" (UID: \"dfbac2a4-3c47-4ac8-8643-c322886121d4\") " pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" Nov 27 11:21:10 crc kubenswrapper[4807]: I1127 11:21:10.235427 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdslm\" (UniqueName: \"kubernetes.io/projected/dfbac2a4-3c47-4ac8-8643-c322886121d4-kube-api-access-bdslm\") pod \"metallb-operator-webhook-server-64bbfd4bf8-wlg8q\" (UID: \"dfbac2a4-3c47-4ac8-8643-c322886121d4\") " pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" Nov 27 11:21:10 crc kubenswrapper[4807]: I1127 11:21:10.235463 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dfbac2a4-3c47-4ac8-8643-c322886121d4-webhook-cert\") pod \"metallb-operator-webhook-server-64bbfd4bf8-wlg8q\" (UID: \"dfbac2a4-3c47-4ac8-8643-c322886121d4\") " pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" Nov 27 11:21:10 crc kubenswrapper[4807]: I1127 11:21:10.238575 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dfbac2a4-3c47-4ac8-8643-c322886121d4-apiservice-cert\") pod \"metallb-operator-webhook-server-64bbfd4bf8-wlg8q\" (UID: \"dfbac2a4-3c47-4ac8-8643-c322886121d4\") " pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" 
Nov 27 11:21:10 crc kubenswrapper[4807]: I1127 11:21:10.243884 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dfbac2a4-3c47-4ac8-8643-c322886121d4-webhook-cert\") pod \"metallb-operator-webhook-server-64bbfd4bf8-wlg8q\" (UID: \"dfbac2a4-3c47-4ac8-8643-c322886121d4\") " pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" Nov 27 11:21:10 crc kubenswrapper[4807]: I1127 11:21:10.249701 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdslm\" (UniqueName: \"kubernetes.io/projected/dfbac2a4-3c47-4ac8-8643-c322886121d4-kube-api-access-bdslm\") pod \"metallb-operator-webhook-server-64bbfd4bf8-wlg8q\" (UID: \"dfbac2a4-3c47-4ac8-8643-c322886121d4\") " pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" Nov 27 11:21:10 crc kubenswrapper[4807]: I1127 11:21:10.304616 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" Nov 27 11:21:10 crc kubenswrapper[4807]: I1127 11:21:10.439049 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn"] Nov 27 11:21:10 crc kubenswrapper[4807]: I1127 11:21:10.530029 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q"] Nov 27 11:21:10 crc kubenswrapper[4807]: W1127 11:21:10.542358 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfbac2a4_3c47_4ac8_8643_c322886121d4.slice/crio-7196cfb16f16da70c2f53f96708ff729e079eca06f60b45fa2e51f32dc66a00b WatchSource:0}: Error finding container 7196cfb16f16da70c2f53f96708ff729e079eca06f60b45fa2e51f32dc66a00b: Status 404 returned error can't find the container with id 7196cfb16f16da70c2f53f96708ff729e079eca06f60b45fa2e51f32dc66a00b Nov 27 
11:21:11 crc kubenswrapper[4807]: I1127 11:21:11.072152 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" event={"ID":"fd79b824-f426-4793-b5e4-b351642047f5","Type":"ContainerStarted","Data":"5ef2451614f3bb1a3214c0a58a345f9efe5f980c2f5696db7807ffc26c67820a"} Nov 27 11:21:11 crc kubenswrapper[4807]: I1127 11:21:11.073531 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" event={"ID":"dfbac2a4-3c47-4ac8-8643-c322886121d4","Type":"ContainerStarted","Data":"7196cfb16f16da70c2f53f96708ff729e079eca06f60b45fa2e51f32dc66a00b"} Nov 27 11:21:14 crc kubenswrapper[4807]: I1127 11:21:14.094951 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" event={"ID":"fd79b824-f426-4793-b5e4-b351642047f5","Type":"ContainerStarted","Data":"c978c6a10926da83e87ac71eacdbf58d16bca07960bab64641a6399afe1c4699"} Nov 27 11:21:14 crc kubenswrapper[4807]: I1127 11:21:14.096400 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" Nov 27 11:21:14 crc kubenswrapper[4807]: I1127 11:21:14.120309 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" podStartSLOduration=2.3377050329999998 podStartE2EDuration="5.120294456s" podCreationTimestamp="2025-11-27 11:21:09 +0000 UTC" firstStartedPulling="2025-11-27 11:21:10.463106299 +0000 UTC m=+711.562604497" lastFinishedPulling="2025-11-27 11:21:13.245695722 +0000 UTC m=+714.345193920" observedRunningTime="2025-11-27 11:21:14.117563433 +0000 UTC m=+715.217061651" watchObservedRunningTime="2025-11-27 11:21:14.120294456 +0000 UTC m=+715.219792654" Nov 27 11:21:16 crc kubenswrapper[4807]: I1127 11:21:16.109848 4807 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" event={"ID":"dfbac2a4-3c47-4ac8-8643-c322886121d4","Type":"ContainerStarted","Data":"4d1e96781e73ca0bff51179ed6cb7f2f19487212c98701f59e60e811cb717113"} Nov 27 11:21:16 crc kubenswrapper[4807]: I1127 11:21:16.136227 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" podStartSLOduration=2.684379293 podStartE2EDuration="7.136212284s" podCreationTimestamp="2025-11-27 11:21:09 +0000 UTC" firstStartedPulling="2025-11-27 11:21:10.545675691 +0000 UTC m=+711.645173889" lastFinishedPulling="2025-11-27 11:21:14.997508682 +0000 UTC m=+716.097006880" observedRunningTime="2025-11-27 11:21:16.133917933 +0000 UTC m=+717.233416161" watchObservedRunningTime="2025-11-27 11:21:16.136212284 +0000 UTC m=+717.235710482" Nov 27 11:21:17 crc kubenswrapper[4807]: I1127 11:21:17.115336 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" Nov 27 11:21:30 crc kubenswrapper[4807]: I1127 11:21:30.312687 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-64bbfd4bf8-wlg8q" Nov 27 11:21:49 crc kubenswrapper[4807]: I1127 11:21:49.920476 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7dd7bd5d6c-lvbpn" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.546829 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5rlln"] Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.549430 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.551937 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.552014 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.553909 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-n27q9" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.560127 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8"] Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.560969 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.565518 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.570098 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f0dfbf-ad37-403d-968c-852dec2e09a0-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-xksm8\" (UID: \"43f0dfbf-ad37-403d-968c-852dec2e09a0\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.570167 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q82n8\" (UniqueName: \"kubernetes.io/projected/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-kube-api-access-q82n8\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.570282 
4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-metrics-certs\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.570333 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-metrics\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.570367 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-frr-conf\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.570388 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-reloader\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.570418 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-frr-startup\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.570518 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t52q8\" 
(UniqueName: \"kubernetes.io/projected/43f0dfbf-ad37-403d-968c-852dec2e09a0-kube-api-access-t52q8\") pod \"frr-k8s-webhook-server-7fcb986d4-xksm8\" (UID: \"43f0dfbf-ad37-403d-968c-852dec2e09a0\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.570597 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-frr-sockets\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.584199 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8"] Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.653513 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-chfn8"] Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.654391 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-chfn8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.657162 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.657415 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.657533 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.658617 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-pkvz6"] Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.659420 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-vzh28" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.659470 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-pkvz6" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.661764 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671459 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfa209ab-1103-43a9-88e3-b7dd7048b2a6-metrics-certs\") pod \"controller-f8648f98b-pkvz6\" (UID: \"cfa209ab-1103-43a9-88e3-b7dd7048b2a6\") " pod="metallb-system/controller-f8648f98b-pkvz6" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671494 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-frr-sockets\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671519 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f0dfbf-ad37-403d-968c-852dec2e09a0-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-xksm8\" (UID: \"43f0dfbf-ad37-403d-968c-852dec2e09a0\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671543 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lknr5\" (UniqueName: \"kubernetes.io/projected/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-kube-api-access-lknr5\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671559 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-metallb-excludel2\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671578 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-memberlist\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671597 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q82n8\" (UniqueName: \"kubernetes.io/projected/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-kube-api-access-q82n8\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671613 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-metrics-certs\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671628 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-metrics-certs\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671642 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-metrics\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " 
pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671656 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-reloader\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671668 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-frr-conf\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671683 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-frr-startup\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671705 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t52q8\" (UniqueName: \"kubernetes.io/projected/43f0dfbf-ad37-403d-968c-852dec2e09a0-kube-api-access-t52q8\") pod \"frr-k8s-webhook-server-7fcb986d4-xksm8\" (UID: \"43f0dfbf-ad37-403d-968c-852dec2e09a0\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671721 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa209ab-1103-43a9-88e3-b7dd7048b2a6-cert\") pod \"controller-f8648f98b-pkvz6\" (UID: \"cfa209ab-1103-43a9-88e3-b7dd7048b2a6\") " pod="metallb-system/controller-f8648f98b-pkvz6" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.671754 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g4d2\" (UniqueName: \"kubernetes.io/projected/cfa209ab-1103-43a9-88e3-b7dd7048b2a6-kube-api-access-2g4d2\") pod \"controller-f8648f98b-pkvz6\" (UID: \"cfa209ab-1103-43a9-88e3-b7dd7048b2a6\") " pod="metallb-system/controller-f8648f98b-pkvz6" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.672109 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-frr-sockets\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: E1127 11:21:50.672178 4807 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Nov 27 11:21:50 crc kubenswrapper[4807]: E1127 11:21:50.672214 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f0dfbf-ad37-403d-968c-852dec2e09a0-cert podName:43f0dfbf-ad37-403d-968c-852dec2e09a0 nodeName:}" failed. No retries permitted until 2025-11-27 11:21:51.172201164 +0000 UTC m=+752.271699362 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43f0dfbf-ad37-403d-968c-852dec2e09a0-cert") pod "frr-k8s-webhook-server-7fcb986d4-xksm8" (UID: "43f0dfbf-ad37-403d-968c-852dec2e09a0") : secret "frr-k8s-webhook-server-cert" not found Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.673501 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-metrics\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.673528 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-reloader\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.673620 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-frr-conf\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.674213 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-frr-startup\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.679701 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-metrics-certs\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" 
Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.686324 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-pkvz6"] Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.709288 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t52q8\" (UniqueName: \"kubernetes.io/projected/43f0dfbf-ad37-403d-968c-852dec2e09a0-kube-api-access-t52q8\") pod \"frr-k8s-webhook-server-7fcb986d4-xksm8\" (UID: \"43f0dfbf-ad37-403d-968c-852dec2e09a0\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.720977 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q82n8\" (UniqueName: \"kubernetes.io/projected/0ddec633-d788-4b9f-afe6-c059e3c7f2e5-kube-api-access-q82n8\") pod \"frr-k8s-5rlln\" (UID: \"0ddec633-d788-4b9f-afe6-c059e3c7f2e5\") " pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.772554 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa209ab-1103-43a9-88e3-b7dd7048b2a6-cert\") pod \"controller-f8648f98b-pkvz6\" (UID: \"cfa209ab-1103-43a9-88e3-b7dd7048b2a6\") " pod="metallb-system/controller-f8648f98b-pkvz6" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.772798 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g4d2\" (UniqueName: \"kubernetes.io/projected/cfa209ab-1103-43a9-88e3-b7dd7048b2a6-kube-api-access-2g4d2\") pod \"controller-f8648f98b-pkvz6\" (UID: \"cfa209ab-1103-43a9-88e3-b7dd7048b2a6\") " pod="metallb-system/controller-f8648f98b-pkvz6" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.772872 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfa209ab-1103-43a9-88e3-b7dd7048b2a6-metrics-certs\") pod 
\"controller-f8648f98b-pkvz6\" (UID: \"cfa209ab-1103-43a9-88e3-b7dd7048b2a6\") " pod="metallb-system/controller-f8648f98b-pkvz6" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.772967 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lknr5\" (UniqueName: \"kubernetes.io/projected/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-kube-api-access-lknr5\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.773045 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-metallb-excludel2\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.773119 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-memberlist\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.773198 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-metrics-certs\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:50 crc kubenswrapper[4807]: E1127 11:21:50.773969 4807 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.774044 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-metallb-excludel2\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:50 crc kubenswrapper[4807]: E1127 11:21:50.774175 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-memberlist podName:6bdffc3f-68a4-4eb0-a4a7-725db327ea08 nodeName:}" failed. No retries permitted until 2025-11-27 11:21:51.274101073 +0000 UTC m=+752.373599271 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-memberlist") pod "speaker-chfn8" (UID: "6bdffc3f-68a4-4eb0-a4a7-725db327ea08") : secret "metallb-memberlist" not found Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.774222 4807 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.775831 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cfa209ab-1103-43a9-88e3-b7dd7048b2a6-metrics-certs\") pod \"controller-f8648f98b-pkvz6\" (UID: \"cfa209ab-1103-43a9-88e3-b7dd7048b2a6\") " pod="metallb-system/controller-f8648f98b-pkvz6" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.776630 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-metrics-certs\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.786293 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfa209ab-1103-43a9-88e3-b7dd7048b2a6-cert\") pod \"controller-f8648f98b-pkvz6\" (UID: 
\"cfa209ab-1103-43a9-88e3-b7dd7048b2a6\") " pod="metallb-system/controller-f8648f98b-pkvz6" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.787789 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g4d2\" (UniqueName: \"kubernetes.io/projected/cfa209ab-1103-43a9-88e3-b7dd7048b2a6-kube-api-access-2g4d2\") pod \"controller-f8648f98b-pkvz6\" (UID: \"cfa209ab-1103-43a9-88e3-b7dd7048b2a6\") " pod="metallb-system/controller-f8648f98b-pkvz6" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.799343 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lknr5\" (UniqueName: \"kubernetes.io/projected/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-kube-api-access-lknr5\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:50 crc kubenswrapper[4807]: I1127 11:21:50.866649 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5rlln" Nov 27 11:21:51 crc kubenswrapper[4807]: I1127 11:21:51.007990 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-pkvz6" Nov 27 11:21:51 crc kubenswrapper[4807]: I1127 11:21:51.179377 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f0dfbf-ad37-403d-968c-852dec2e09a0-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-xksm8\" (UID: \"43f0dfbf-ad37-403d-968c-852dec2e09a0\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" Nov 27 11:21:51 crc kubenswrapper[4807]: E1127 11:21:51.179994 4807 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Nov 27 11:21:51 crc kubenswrapper[4807]: E1127 11:21:51.180057 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43f0dfbf-ad37-403d-968c-852dec2e09a0-cert podName:43f0dfbf-ad37-403d-968c-852dec2e09a0 nodeName:}" failed. No retries permitted until 2025-11-27 11:21:52.180041424 +0000 UTC m=+753.279539622 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43f0dfbf-ad37-403d-968c-852dec2e09a0-cert") pod "frr-k8s-webhook-server-7fcb986d4-xksm8" (UID: "43f0dfbf-ad37-403d-968c-852dec2e09a0") : secret "frr-k8s-webhook-server-cert" not found Nov 27 11:21:51 crc kubenswrapper[4807]: I1127 11:21:51.281682 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-memberlist\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:51 crc kubenswrapper[4807]: E1127 11:21:51.281882 4807 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 27 11:21:51 crc kubenswrapper[4807]: E1127 11:21:51.281963 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-memberlist podName:6bdffc3f-68a4-4eb0-a4a7-725db327ea08 nodeName:}" failed. No retries permitted until 2025-11-27 11:21:52.281943723 +0000 UTC m=+753.381441921 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-memberlist") pod "speaker-chfn8" (UID: "6bdffc3f-68a4-4eb0-a4a7-725db327ea08") : secret "metallb-memberlist" not found Nov 27 11:21:51 crc kubenswrapper[4807]: I1127 11:21:51.331022 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5rlln" event={"ID":"0ddec633-d788-4b9f-afe6-c059e3c7f2e5","Type":"ContainerStarted","Data":"9cd1ac1b49c656c2b7bfff9e64b6d11f082aa0d0be3af1e37920f400b8e6b453"} Nov 27 11:21:51 crc kubenswrapper[4807]: I1127 11:21:51.419449 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-pkvz6"] Nov 27 11:21:51 crc kubenswrapper[4807]: W1127 11:21:51.425737 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa209ab_1103_43a9_88e3_b7dd7048b2a6.slice/crio-60ef984d5a05e4e6da287cea8e493fc5769cfa09b8df50694de391a914e42f47 WatchSource:0}: Error finding container 60ef984d5a05e4e6da287cea8e493fc5769cfa09b8df50694de391a914e42f47: Status 404 returned error can't find the container with id 60ef984d5a05e4e6da287cea8e493fc5769cfa09b8df50694de391a914e42f47 Nov 27 11:21:52 crc kubenswrapper[4807]: I1127 11:21:52.199326 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f0dfbf-ad37-403d-968c-852dec2e09a0-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-xksm8\" (UID: \"43f0dfbf-ad37-403d-968c-852dec2e09a0\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" Nov 27 11:21:52 crc kubenswrapper[4807]: I1127 11:21:52.206603 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43f0dfbf-ad37-403d-968c-852dec2e09a0-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-xksm8\" (UID: \"43f0dfbf-ad37-403d-968c-852dec2e09a0\") " 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" Nov 27 11:21:52 crc kubenswrapper[4807]: I1127 11:21:52.300571 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-memberlist\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:52 crc kubenswrapper[4807]: I1127 11:21:52.303423 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6bdffc3f-68a4-4eb0-a4a7-725db327ea08-memberlist\") pod \"speaker-chfn8\" (UID: \"6bdffc3f-68a4-4eb0-a4a7-725db327ea08\") " pod="metallb-system/speaker-chfn8" Nov 27 11:21:52 crc kubenswrapper[4807]: I1127 11:21:52.338950 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-pkvz6" event={"ID":"cfa209ab-1103-43a9-88e3-b7dd7048b2a6","Type":"ContainerStarted","Data":"1cd29de5eb32519b3f24dd36a0312679315f0f33a7b59074063fffe1a49c22a6"} Nov 27 11:21:52 crc kubenswrapper[4807]: I1127 11:21:52.339223 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-pkvz6" event={"ID":"cfa209ab-1103-43a9-88e3-b7dd7048b2a6","Type":"ContainerStarted","Data":"1c1d71075106e4c55fea3940fb29e4b44cc30bd81a60e017d8423ea6a590eb46"} Nov 27 11:21:52 crc kubenswrapper[4807]: I1127 11:21:52.339439 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-pkvz6" Nov 27 11:21:52 crc kubenswrapper[4807]: I1127 11:21:52.339481 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-pkvz6" event={"ID":"cfa209ab-1103-43a9-88e3-b7dd7048b2a6","Type":"ContainerStarted","Data":"60ef984d5a05e4e6da287cea8e493fc5769cfa09b8df50694de391a914e42f47"} Nov 27 11:21:52 crc kubenswrapper[4807]: I1127 11:21:52.359766 4807 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="metallb-system/controller-f8648f98b-pkvz6" podStartSLOduration=2.35974977 podStartE2EDuration="2.35974977s" podCreationTimestamp="2025-11-27 11:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:21:52.357642254 +0000 UTC m=+753.457140452" watchObservedRunningTime="2025-11-27 11:21:52.35974977 +0000 UTC m=+753.459247968" Nov 27 11:21:52 crc kubenswrapper[4807]: I1127 11:21:52.386843 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" Nov 27 11:21:52 crc kubenswrapper[4807]: I1127 11:21:52.469761 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-chfn8" Nov 27 11:21:52 crc kubenswrapper[4807]: W1127 11:21:52.490231 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bdffc3f_68a4_4eb0_a4a7_725db327ea08.slice/crio-2aba7e9bc2bc0628913d43b7fc1731c3a14ebff477258c4258c044897b2e4fba WatchSource:0}: Error finding container 2aba7e9bc2bc0628913d43b7fc1731c3a14ebff477258c4258c044897b2e4fba: Status 404 returned error can't find the container with id 2aba7e9bc2bc0628913d43b7fc1731c3a14ebff477258c4258c044897b2e4fba Nov 27 11:21:52 crc kubenswrapper[4807]: I1127 11:21:52.787301 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8"] Nov 27 11:21:52 crc kubenswrapper[4807]: W1127 11:21:52.810666 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43f0dfbf_ad37_403d_968c_852dec2e09a0.slice/crio-92fe1003b5eba0037ccb62082ad57d85e25756b943e0c6d52db1ee2c4ef356e1 WatchSource:0}: Error finding container 92fe1003b5eba0037ccb62082ad57d85e25756b943e0c6d52db1ee2c4ef356e1: Status 404 returned error 
can't find the container with id 92fe1003b5eba0037ccb62082ad57d85e25756b943e0c6d52db1ee2c4ef356e1 Nov 27 11:21:53 crc kubenswrapper[4807]: I1127 11:21:53.348684 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" event={"ID":"43f0dfbf-ad37-403d-968c-852dec2e09a0","Type":"ContainerStarted","Data":"92fe1003b5eba0037ccb62082ad57d85e25756b943e0c6d52db1ee2c4ef356e1"} Nov 27 11:21:53 crc kubenswrapper[4807]: I1127 11:21:53.355780 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-chfn8" event={"ID":"6bdffc3f-68a4-4eb0-a4a7-725db327ea08","Type":"ContainerStarted","Data":"086f948e6184e9d7344bb6ede79efcc201e1d3931fcdd910fddd3a556010c2bc"} Nov 27 11:21:53 crc kubenswrapper[4807]: I1127 11:21:53.355849 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-chfn8" event={"ID":"6bdffc3f-68a4-4eb0-a4a7-725db327ea08","Type":"ContainerStarted","Data":"43958a5dcce11f82247a3500f4a836ca428599dfb211c4bf2814dcdf9a5bc467"} Nov 27 11:21:53 crc kubenswrapper[4807]: I1127 11:21:53.355867 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-chfn8" event={"ID":"6bdffc3f-68a4-4eb0-a4a7-725db327ea08","Type":"ContainerStarted","Data":"2aba7e9bc2bc0628913d43b7fc1731c3a14ebff477258c4258c044897b2e4fba"} Nov 27 11:21:53 crc kubenswrapper[4807]: I1127 11:21:53.356058 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-chfn8" Nov 27 11:21:53 crc kubenswrapper[4807]: I1127 11:21:53.383530 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-chfn8" podStartSLOduration=3.383511885 podStartE2EDuration="3.383511885s" podCreationTimestamp="2025-11-27 11:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:21:53.38034827 +0000 UTC m=+754.479846468" 
watchObservedRunningTime="2025-11-27 11:21:53.383511885 +0000 UTC m=+754.483010083" Nov 27 11:21:54 crc kubenswrapper[4807]: I1127 11:21:54.203859 4807 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Nov 27 11:21:58 crc kubenswrapper[4807]: I1127 11:21:58.387738 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" event={"ID":"43f0dfbf-ad37-403d-968c-852dec2e09a0","Type":"ContainerStarted","Data":"6fb92c872b50cf10aa27fc7bb6f67ba6721c896f738452b71e4c16adccdec69b"} Nov 27 11:21:58 crc kubenswrapper[4807]: I1127 11:21:58.388242 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" Nov 27 11:21:58 crc kubenswrapper[4807]: I1127 11:21:58.393370 4807 generic.go:334] "Generic (PLEG): container finished" podID="0ddec633-d788-4b9f-afe6-c059e3c7f2e5" containerID="763fef96eb6cb31857b4b067963b59927c43765fc5e33a169d30672904f39057" exitCode=0 Nov 27 11:21:58 crc kubenswrapper[4807]: I1127 11:21:58.393400 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5rlln" event={"ID":"0ddec633-d788-4b9f-afe6-c059e3c7f2e5","Type":"ContainerDied","Data":"763fef96eb6cb31857b4b067963b59927c43765fc5e33a169d30672904f39057"} Nov 27 11:21:58 crc kubenswrapper[4807]: I1127 11:21:58.412876 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" podStartSLOduration=3.265594509 podStartE2EDuration="8.412858964s" podCreationTimestamp="2025-11-27 11:21:50 +0000 UTC" firstStartedPulling="2025-11-27 11:21:52.813366633 +0000 UTC m=+753.912864831" lastFinishedPulling="2025-11-27 11:21:57.960631088 +0000 UTC m=+759.060129286" observedRunningTime="2025-11-27 11:21:58.409100484 +0000 UTC m=+759.508598682" watchObservedRunningTime="2025-11-27 11:21:58.412858964 +0000 UTC m=+759.512357162" Nov 
27 11:21:59 crc kubenswrapper[4807]: I1127 11:21:59.401404 4807 generic.go:334] "Generic (PLEG): container finished" podID="0ddec633-d788-4b9f-afe6-c059e3c7f2e5" containerID="f4ae4abfe1d5b562f4b8a9333b1d5dfd21e09ff30cdca3dabab637542f8a7f01" exitCode=0 Nov 27 11:21:59 crc kubenswrapper[4807]: I1127 11:21:59.401451 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5rlln" event={"ID":"0ddec633-d788-4b9f-afe6-c059e3c7f2e5","Type":"ContainerDied","Data":"f4ae4abfe1d5b562f4b8a9333b1d5dfd21e09ff30cdca3dabab637542f8a7f01"} Nov 27 11:22:00 crc kubenswrapper[4807]: I1127 11:22:00.413828 4807 generic.go:334] "Generic (PLEG): container finished" podID="0ddec633-d788-4b9f-afe6-c059e3c7f2e5" containerID="640c6b2f6b4709f6a7c7b5f667f428dbb7d5f65a1abce3b322827f9009fe0479" exitCode=0 Nov 27 11:22:00 crc kubenswrapper[4807]: I1127 11:22:00.413916 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5rlln" event={"ID":"0ddec633-d788-4b9f-afe6-c059e3c7f2e5","Type":"ContainerDied","Data":"640c6b2f6b4709f6a7c7b5f667f428dbb7d5f65a1abce3b322827f9009fe0479"} Nov 27 11:22:01 crc kubenswrapper[4807]: I1127 11:22:01.013030 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-pkvz6" Nov 27 11:22:01 crc kubenswrapper[4807]: I1127 11:22:01.422937 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5rlln" event={"ID":"0ddec633-d788-4b9f-afe6-c059e3c7f2e5","Type":"ContainerStarted","Data":"2e5121975663860d0cab40773266db40fa7f11e5b1eea92059d2d0faa42a384e"} Nov 27 11:22:01 crc kubenswrapper[4807]: I1127 11:22:01.422973 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5rlln" event={"ID":"0ddec633-d788-4b9f-afe6-c059e3c7f2e5","Type":"ContainerStarted","Data":"c5c32bd5645396922d75e204c9fea522b5640502f3927980629409c95862f6f9"} Nov 27 11:22:01 crc kubenswrapper[4807]: I1127 11:22:01.422983 4807 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5rlln" event={"ID":"0ddec633-d788-4b9f-afe6-c059e3c7f2e5","Type":"ContainerStarted","Data":"dc0b1c0643d15efde24fcf9e20da8219b93c25da9b4e48cb2ae41895a8b9b25f"} Nov 27 11:22:01 crc kubenswrapper[4807]: I1127 11:22:01.422991 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5rlln" event={"ID":"0ddec633-d788-4b9f-afe6-c059e3c7f2e5","Type":"ContainerStarted","Data":"d3d4a5a3bfcc0f9c3f8cdb6c3a33381d9fc7f55bd4efac211781e3f6c5c3407c"} Nov 27 11:22:01 crc kubenswrapper[4807]: I1127 11:22:01.423009 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5rlln" event={"ID":"0ddec633-d788-4b9f-afe6-c059e3c7f2e5","Type":"ContainerStarted","Data":"5f5c32942193ac245310b23173aa12190bd0c7bad5b6a583d50e1754ccffcd19"} Nov 27 11:22:01 crc kubenswrapper[4807]: I1127 11:22:01.423018 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5rlln" event={"ID":"0ddec633-d788-4b9f-afe6-c059e3c7f2e5","Type":"ContainerStarted","Data":"820b815c19f0f42e42a88c55a41b7a23b5fca411e3d907e6d6dd7ae3e926910e"} Nov 27 11:22:01 crc kubenswrapper[4807]: I1127 11:22:01.423108 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5rlln" Nov 27 11:22:01 crc kubenswrapper[4807]: I1127 11:22:01.448462 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5rlln" podStartSLOduration=4.492967327 podStartE2EDuration="11.448447647s" podCreationTimestamp="2025-11-27 11:21:50 +0000 UTC" firstStartedPulling="2025-11-27 11:21:50.971967212 +0000 UTC m=+752.071465410" lastFinishedPulling="2025-11-27 11:21:57.927447532 +0000 UTC m=+759.026945730" observedRunningTime="2025-11-27 11:22:01.446128585 +0000 UTC m=+762.545626783" watchObservedRunningTime="2025-11-27 11:22:01.448447647 +0000 UTC m=+762.547945845" Nov 27 11:22:02 crc kubenswrapper[4807]: I1127 11:22:02.472902 4807 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-chfn8" Nov 27 11:22:05 crc kubenswrapper[4807]: I1127 11:22:05.221026 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hpd82"] Nov 27 11:22:05 crc kubenswrapper[4807]: I1127 11:22:05.221977 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hpd82" Nov 27 11:22:05 crc kubenswrapper[4807]: I1127 11:22:05.224476 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 27 11:22:05 crc kubenswrapper[4807]: I1127 11:22:05.224561 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-5vtsn" Nov 27 11:22:05 crc kubenswrapper[4807]: I1127 11:22:05.225055 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 27 11:22:05 crc kubenswrapper[4807]: I1127 11:22:05.280338 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hpd82"] Nov 27 11:22:05 crc kubenswrapper[4807]: I1127 11:22:05.364667 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzvxf\" (UniqueName: \"kubernetes.io/projected/56e2bde0-6151-4863-a4f0-c74ed1454c00-kube-api-access-wzvxf\") pod \"openstack-operator-index-hpd82\" (UID: \"56e2bde0-6151-4863-a4f0-c74ed1454c00\") " pod="openstack-operators/openstack-operator-index-hpd82" Nov 27 11:22:05 crc kubenswrapper[4807]: I1127 11:22:05.465890 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzvxf\" (UniqueName: \"kubernetes.io/projected/56e2bde0-6151-4863-a4f0-c74ed1454c00-kube-api-access-wzvxf\") pod \"openstack-operator-index-hpd82\" (UID: \"56e2bde0-6151-4863-a4f0-c74ed1454c00\") " 
pod="openstack-operators/openstack-operator-index-hpd82" Nov 27 11:22:05 crc kubenswrapper[4807]: I1127 11:22:05.483273 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzvxf\" (UniqueName: \"kubernetes.io/projected/56e2bde0-6151-4863-a4f0-c74ed1454c00-kube-api-access-wzvxf\") pod \"openstack-operator-index-hpd82\" (UID: \"56e2bde0-6151-4863-a4f0-c74ed1454c00\") " pod="openstack-operators/openstack-operator-index-hpd82" Nov 27 11:22:05 crc kubenswrapper[4807]: I1127 11:22:05.541765 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hpd82" Nov 27 11:22:05 crc kubenswrapper[4807]: I1127 11:22:05.867314 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5rlln" Nov 27 11:22:05 crc kubenswrapper[4807]: I1127 11:22:05.903855 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5rlln" Nov 27 11:22:06 crc kubenswrapper[4807]: I1127 11:22:06.004068 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hpd82"] Nov 27 11:22:06 crc kubenswrapper[4807]: W1127 11:22:06.010399 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56e2bde0_6151_4863_a4f0_c74ed1454c00.slice/crio-b92388bbc958217f90ddbc87a8ceb34139e5c4f463d24384da079bcfa0912511 WatchSource:0}: Error finding container b92388bbc958217f90ddbc87a8ceb34139e5c4f463d24384da079bcfa0912511: Status 404 returned error can't find the container with id b92388bbc958217f90ddbc87a8ceb34139e5c4f463d24384da079bcfa0912511 Nov 27 11:22:06 crc kubenswrapper[4807]: I1127 11:22:06.451056 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hpd82" 
event={"ID":"56e2bde0-6151-4863-a4f0-c74ed1454c00","Type":"ContainerStarted","Data":"b92388bbc958217f90ddbc87a8ceb34139e5c4f463d24384da079bcfa0912511"} Nov 27 11:22:08 crc kubenswrapper[4807]: I1127 11:22:08.002047 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hpd82"] Nov 27 11:22:08 crc kubenswrapper[4807]: I1127 11:22:08.466298 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hpd82" event={"ID":"56e2bde0-6151-4863-a4f0-c74ed1454c00","Type":"ContainerStarted","Data":"582c081a4b0073286c88b6dd8d931927cf76cccd567e0cada9606953749b23df"} Nov 27 11:22:08 crc kubenswrapper[4807]: I1127 11:22:08.466419 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-hpd82" podUID="56e2bde0-6151-4863-a4f0-c74ed1454c00" containerName="registry-server" containerID="cri-o://582c081a4b0073286c88b6dd8d931927cf76cccd567e0cada9606953749b23df" gracePeriod=2 Nov 27 11:22:08 crc kubenswrapper[4807]: I1127 11:22:08.520750 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hpd82" podStartSLOduration=1.5384177540000001 podStartE2EDuration="3.520724524s" podCreationTimestamp="2025-11-27 11:22:05 +0000 UTC" firstStartedPulling="2025-11-27 11:22:06.012508182 +0000 UTC m=+767.112006380" lastFinishedPulling="2025-11-27 11:22:07.994814952 +0000 UTC m=+769.094313150" observedRunningTime="2025-11-27 11:22:08.514412155 +0000 UTC m=+769.613910353" watchObservedRunningTime="2025-11-27 11:22:08.520724524 +0000 UTC m=+769.620222762" Nov 27 11:22:08 crc kubenswrapper[4807]: I1127 11:22:08.603617 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-z2sjb"] Nov 27 11:22:08 crc kubenswrapper[4807]: I1127 11:22:08.604356 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z2sjb" Nov 27 11:22:08 crc kubenswrapper[4807]: I1127 11:22:08.617433 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z2sjb"] Nov 27 11:22:08 crc kubenswrapper[4807]: I1127 11:22:08.718112 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svkv7\" (UniqueName: \"kubernetes.io/projected/c9606905-c5fd-4ebe-9942-b013364d7ca8-kube-api-access-svkv7\") pod \"openstack-operator-index-z2sjb\" (UID: \"c9606905-c5fd-4ebe-9942-b013364d7ca8\") " pod="openstack-operators/openstack-operator-index-z2sjb" Nov 27 11:22:08 crc kubenswrapper[4807]: I1127 11:22:08.819301 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svkv7\" (UniqueName: \"kubernetes.io/projected/c9606905-c5fd-4ebe-9942-b013364d7ca8-kube-api-access-svkv7\") pod \"openstack-operator-index-z2sjb\" (UID: \"c9606905-c5fd-4ebe-9942-b013364d7ca8\") " pod="openstack-operators/openstack-operator-index-z2sjb" Nov 27 11:22:08 crc kubenswrapper[4807]: I1127 11:22:08.838515 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svkv7\" (UniqueName: \"kubernetes.io/projected/c9606905-c5fd-4ebe-9942-b013364d7ca8-kube-api-access-svkv7\") pod \"openstack-operator-index-z2sjb\" (UID: \"c9606905-c5fd-4ebe-9942-b013364d7ca8\") " pod="openstack-operators/openstack-operator-index-z2sjb" Nov 27 11:22:08 crc kubenswrapper[4807]: I1127 11:22:08.884034 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hpd82" Nov 27 11:22:08 crc kubenswrapper[4807]: I1127 11:22:08.932678 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-z2sjb" Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.023812 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzvxf\" (UniqueName: \"kubernetes.io/projected/56e2bde0-6151-4863-a4f0-c74ed1454c00-kube-api-access-wzvxf\") pod \"56e2bde0-6151-4863-a4f0-c74ed1454c00\" (UID: \"56e2bde0-6151-4863-a4f0-c74ed1454c00\") " Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.029008 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56e2bde0-6151-4863-a4f0-c74ed1454c00-kube-api-access-wzvxf" (OuterVolumeSpecName: "kube-api-access-wzvxf") pod "56e2bde0-6151-4863-a4f0-c74ed1454c00" (UID: "56e2bde0-6151-4863-a4f0-c74ed1454c00"). InnerVolumeSpecName "kube-api-access-wzvxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.125289 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzvxf\" (UniqueName: \"kubernetes.io/projected/56e2bde0-6151-4863-a4f0-c74ed1454c00-kube-api-access-wzvxf\") on node \"crc\" DevicePath \"\"" Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.326220 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-z2sjb"] Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.472966 4807 generic.go:334] "Generic (PLEG): container finished" podID="56e2bde0-6151-4863-a4f0-c74ed1454c00" containerID="582c081a4b0073286c88b6dd8d931927cf76cccd567e0cada9606953749b23df" exitCode=0 Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.473032 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-hpd82" Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.473063 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hpd82" event={"ID":"56e2bde0-6151-4863-a4f0-c74ed1454c00","Type":"ContainerDied","Data":"582c081a4b0073286c88b6dd8d931927cf76cccd567e0cada9606953749b23df"} Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.473119 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hpd82" event={"ID":"56e2bde0-6151-4863-a4f0-c74ed1454c00","Type":"ContainerDied","Data":"b92388bbc958217f90ddbc87a8ceb34139e5c4f463d24384da079bcfa0912511"} Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.473138 4807 scope.go:117] "RemoveContainer" containerID="582c081a4b0073286c88b6dd8d931927cf76cccd567e0cada9606953749b23df" Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.476735 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z2sjb" event={"ID":"c9606905-c5fd-4ebe-9942-b013364d7ca8","Type":"ContainerStarted","Data":"ac29a9fd135a9c4ff97a14e37323dacff4161a1b49e40e1a067d386bb186f842"} Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.495720 4807 scope.go:117] "RemoveContainer" containerID="582c081a4b0073286c88b6dd8d931927cf76cccd567e0cada9606953749b23df" Nov 27 11:22:09 crc kubenswrapper[4807]: E1127 11:22:09.496165 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"582c081a4b0073286c88b6dd8d931927cf76cccd567e0cada9606953749b23df\": container with ID starting with 582c081a4b0073286c88b6dd8d931927cf76cccd567e0cada9606953749b23df not found: ID does not exist" containerID="582c081a4b0073286c88b6dd8d931927cf76cccd567e0cada9606953749b23df" Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.496202 4807 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"582c081a4b0073286c88b6dd8d931927cf76cccd567e0cada9606953749b23df"} err="failed to get container status \"582c081a4b0073286c88b6dd8d931927cf76cccd567e0cada9606953749b23df\": rpc error: code = NotFound desc = could not find container \"582c081a4b0073286c88b6dd8d931927cf76cccd567e0cada9606953749b23df\": container with ID starting with 582c081a4b0073286c88b6dd8d931927cf76cccd567e0cada9606953749b23df not found: ID does not exist" Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.503230 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-hpd82"] Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.507261 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-hpd82"] Nov 27 11:22:09 crc kubenswrapper[4807]: I1127 11:22:09.542570 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56e2bde0-6151-4863-a4f0-c74ed1454c00" path="/var/lib/kubelet/pods/56e2bde0-6151-4863-a4f0-c74ed1454c00/volumes" Nov 27 11:22:10 crc kubenswrapper[4807]: I1127 11:22:10.489222 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-z2sjb" event={"ID":"c9606905-c5fd-4ebe-9942-b013364d7ca8","Type":"ContainerStarted","Data":"e2f3760663a44c3c55fb5e69dc307957409d3898ffaecda2d7823d163742a539"} Nov 27 11:22:10 crc kubenswrapper[4807]: I1127 11:22:10.517548 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-z2sjb" podStartSLOduration=2.4623860300000002 podStartE2EDuration="2.517518721s" podCreationTimestamp="2025-11-27 11:22:08 +0000 UTC" firstStartedPulling="2025-11-27 11:22:09.336579632 +0000 UTC m=+770.436077830" lastFinishedPulling="2025-11-27 11:22:09.391712323 +0000 UTC m=+770.491210521" observedRunningTime="2025-11-27 11:22:10.511223683 +0000 UTC m=+771.610721941" 
watchObservedRunningTime="2025-11-27 11:22:10.517518721 +0000 UTC m=+771.617016949" Nov 27 11:22:10 crc kubenswrapper[4807]: I1127 11:22:10.873727 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5rlln" Nov 27 11:22:12 crc kubenswrapper[4807]: I1127 11:22:12.395302 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-xksm8" Nov 27 11:22:18 crc kubenswrapper[4807]: I1127 11:22:18.933057 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-z2sjb" Nov 27 11:22:18 crc kubenswrapper[4807]: I1127 11:22:18.933589 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-z2sjb" Nov 27 11:22:18 crc kubenswrapper[4807]: I1127 11:22:18.974821 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-z2sjb" Nov 27 11:22:19 crc kubenswrapper[4807]: I1127 11:22:19.577873 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-z2sjb" Nov 27 11:22:20 crc kubenswrapper[4807]: I1127 11:22:20.921429 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:22:20 crc kubenswrapper[4807]: I1127 11:22:20.923240 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:22:22 crc 
kubenswrapper[4807]: I1127 11:22:22.852372 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h"] Nov 27 11:22:22 crc kubenswrapper[4807]: E1127 11:22:22.852907 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56e2bde0-6151-4863-a4f0-c74ed1454c00" containerName="registry-server" Nov 27 11:22:22 crc kubenswrapper[4807]: I1127 11:22:22.852921 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="56e2bde0-6151-4863-a4f0-c74ed1454c00" containerName="registry-server" Nov 27 11:22:22 crc kubenswrapper[4807]: I1127 11:22:22.853033 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="56e2bde0-6151-4863-a4f0-c74ed1454c00" containerName="registry-server" Nov 27 11:22:22 crc kubenswrapper[4807]: I1127 11:22:22.853995 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" Nov 27 11:22:22 crc kubenswrapper[4807]: I1127 11:22:22.856154 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-jxtz8" Nov 27 11:22:22 crc kubenswrapper[4807]: I1127 11:22:22.866944 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h"] Nov 27 11:22:23 crc kubenswrapper[4807]: I1127 11:22:23.040740 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ae63091-3a7b-4708-82c7-d59383b22b9b-util\") pod \"d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h\" (UID: \"3ae63091-3a7b-4708-82c7-d59383b22b9b\") " pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" Nov 27 11:22:23 crc kubenswrapper[4807]: I1127 11:22:23.041018 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ae63091-3a7b-4708-82c7-d59383b22b9b-bundle\") pod \"d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h\" (UID: \"3ae63091-3a7b-4708-82c7-d59383b22b9b\") " pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" Nov 27 11:22:23 crc kubenswrapper[4807]: I1127 11:22:23.041114 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4pt6\" (UniqueName: \"kubernetes.io/projected/3ae63091-3a7b-4708-82c7-d59383b22b9b-kube-api-access-p4pt6\") pod \"d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h\" (UID: \"3ae63091-3a7b-4708-82c7-d59383b22b9b\") " pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" Nov 27 11:22:23 crc kubenswrapper[4807]: I1127 11:22:23.142144 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ae63091-3a7b-4708-82c7-d59383b22b9b-util\") pod \"d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h\" (UID: \"3ae63091-3a7b-4708-82c7-d59383b22b9b\") " pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" Nov 27 11:22:23 crc kubenswrapper[4807]: I1127 11:22:23.142585 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ae63091-3a7b-4708-82c7-d59383b22b9b-bundle\") pod \"d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h\" (UID: \"3ae63091-3a7b-4708-82c7-d59383b22b9b\") " pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" Nov 27 11:22:23 crc kubenswrapper[4807]: I1127 11:22:23.142730 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4pt6\" (UniqueName: 
\"kubernetes.io/projected/3ae63091-3a7b-4708-82c7-d59383b22b9b-kube-api-access-p4pt6\") pod \"d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h\" (UID: \"3ae63091-3a7b-4708-82c7-d59383b22b9b\") " pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" Nov 27 11:22:23 crc kubenswrapper[4807]: I1127 11:22:23.143095 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ae63091-3a7b-4708-82c7-d59383b22b9b-bundle\") pod \"d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h\" (UID: \"3ae63091-3a7b-4708-82c7-d59383b22b9b\") " pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" Nov 27 11:22:23 crc kubenswrapper[4807]: I1127 11:22:23.143111 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ae63091-3a7b-4708-82c7-d59383b22b9b-util\") pod \"d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h\" (UID: \"3ae63091-3a7b-4708-82c7-d59383b22b9b\") " pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" Nov 27 11:22:23 crc kubenswrapper[4807]: I1127 11:22:23.160618 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4pt6\" (UniqueName: \"kubernetes.io/projected/3ae63091-3a7b-4708-82c7-d59383b22b9b-kube-api-access-p4pt6\") pod \"d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h\" (UID: \"3ae63091-3a7b-4708-82c7-d59383b22b9b\") " pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" Nov 27 11:22:23 crc kubenswrapper[4807]: I1127 11:22:23.188713 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" Nov 27 11:22:23 crc kubenswrapper[4807]: I1127 11:22:23.604929 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h"] Nov 27 11:22:23 crc kubenswrapper[4807]: W1127 11:22:23.619664 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ae63091_3a7b_4708_82c7_d59383b22b9b.slice/crio-12ca370a2792f08740f2ae8aeb8915685335fbd06b7f2a628ba3db829ddbd333 WatchSource:0}: Error finding container 12ca370a2792f08740f2ae8aeb8915685335fbd06b7f2a628ba3db829ddbd333: Status 404 returned error can't find the container with id 12ca370a2792f08740f2ae8aeb8915685335fbd06b7f2a628ba3db829ddbd333 Nov 27 11:22:24 crc kubenswrapper[4807]: I1127 11:22:24.592548 4807 generic.go:334] "Generic (PLEG): container finished" podID="3ae63091-3a7b-4708-82c7-d59383b22b9b" containerID="a6759bfcf96b1135e297819e8359f71bdb84c178f79cebaf171e3bd28a1c5bc8" exitCode=0 Nov 27 11:22:24 crc kubenswrapper[4807]: I1127 11:22:24.592718 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" event={"ID":"3ae63091-3a7b-4708-82c7-d59383b22b9b","Type":"ContainerDied","Data":"a6759bfcf96b1135e297819e8359f71bdb84c178f79cebaf171e3bd28a1c5bc8"} Nov 27 11:22:24 crc kubenswrapper[4807]: I1127 11:22:24.592872 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" event={"ID":"3ae63091-3a7b-4708-82c7-d59383b22b9b","Type":"ContainerStarted","Data":"12ca370a2792f08740f2ae8aeb8915685335fbd06b7f2a628ba3db829ddbd333"} Nov 27 11:22:25 crc kubenswrapper[4807]: I1127 11:22:25.603622 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="3ae63091-3a7b-4708-82c7-d59383b22b9b" containerID="bea28c7cc5c277ac1e78e66ca6bb007a6e963bdfd1b25f6a1b1675f38cdf4a61" exitCode=0 Nov 27 11:22:25 crc kubenswrapper[4807]: I1127 11:22:25.603707 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" event={"ID":"3ae63091-3a7b-4708-82c7-d59383b22b9b","Type":"ContainerDied","Data":"bea28c7cc5c277ac1e78e66ca6bb007a6e963bdfd1b25f6a1b1675f38cdf4a61"} Nov 27 11:22:26 crc kubenswrapper[4807]: I1127 11:22:26.614341 4807 generic.go:334] "Generic (PLEG): container finished" podID="3ae63091-3a7b-4708-82c7-d59383b22b9b" containerID="d6017787444c22979e6b7ed7bd3bf4dc315ce3faf942f79ebf63dcbb9d72ab29" exitCode=0 Nov 27 11:22:26 crc kubenswrapper[4807]: I1127 11:22:26.614404 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" event={"ID":"3ae63091-3a7b-4708-82c7-d59383b22b9b","Type":"ContainerDied","Data":"d6017787444c22979e6b7ed7bd3bf4dc315ce3faf942f79ebf63dcbb9d72ab29"} Nov 27 11:22:27 crc kubenswrapper[4807]: I1127 11:22:27.940232 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" Nov 27 11:22:28 crc kubenswrapper[4807]: I1127 11:22:28.024965 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ae63091-3a7b-4708-82c7-d59383b22b9b-bundle\") pod \"3ae63091-3a7b-4708-82c7-d59383b22b9b\" (UID: \"3ae63091-3a7b-4708-82c7-d59383b22b9b\") " Nov 27 11:22:28 crc kubenswrapper[4807]: I1127 11:22:28.025550 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ae63091-3a7b-4708-82c7-d59383b22b9b-util\") pod \"3ae63091-3a7b-4708-82c7-d59383b22b9b\" (UID: \"3ae63091-3a7b-4708-82c7-d59383b22b9b\") " Nov 27 11:22:28 crc kubenswrapper[4807]: I1127 11:22:28.025650 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4pt6\" (UniqueName: \"kubernetes.io/projected/3ae63091-3a7b-4708-82c7-d59383b22b9b-kube-api-access-p4pt6\") pod \"3ae63091-3a7b-4708-82c7-d59383b22b9b\" (UID: \"3ae63091-3a7b-4708-82c7-d59383b22b9b\") " Nov 27 11:22:28 crc kubenswrapper[4807]: I1127 11:22:28.026654 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae63091-3a7b-4708-82c7-d59383b22b9b-bundle" (OuterVolumeSpecName: "bundle") pod "3ae63091-3a7b-4708-82c7-d59383b22b9b" (UID: "3ae63091-3a7b-4708-82c7-d59383b22b9b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:22:28 crc kubenswrapper[4807]: I1127 11:22:28.032618 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae63091-3a7b-4708-82c7-d59383b22b9b-kube-api-access-p4pt6" (OuterVolumeSpecName: "kube-api-access-p4pt6") pod "3ae63091-3a7b-4708-82c7-d59383b22b9b" (UID: "3ae63091-3a7b-4708-82c7-d59383b22b9b"). InnerVolumeSpecName "kube-api-access-p4pt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:22:28 crc kubenswrapper[4807]: I1127 11:22:28.039819 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae63091-3a7b-4708-82c7-d59383b22b9b-util" (OuterVolumeSpecName: "util") pod "3ae63091-3a7b-4708-82c7-d59383b22b9b" (UID: "3ae63091-3a7b-4708-82c7-d59383b22b9b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:22:28 crc kubenswrapper[4807]: I1127 11:22:28.127348 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4pt6\" (UniqueName: \"kubernetes.io/projected/3ae63091-3a7b-4708-82c7-d59383b22b9b-kube-api-access-p4pt6\") on node \"crc\" DevicePath \"\"" Nov 27 11:22:28 crc kubenswrapper[4807]: I1127 11:22:28.127413 4807 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3ae63091-3a7b-4708-82c7-d59383b22b9b-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:22:28 crc kubenswrapper[4807]: I1127 11:22:28.127427 4807 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3ae63091-3a7b-4708-82c7-d59383b22b9b-util\") on node \"crc\" DevicePath \"\"" Nov 27 11:22:28 crc kubenswrapper[4807]: I1127 11:22:28.629996 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" event={"ID":"3ae63091-3a7b-4708-82c7-d59383b22b9b","Type":"ContainerDied","Data":"12ca370a2792f08740f2ae8aeb8915685335fbd06b7f2a628ba3db829ddbd333"} Nov 27 11:22:28 crc kubenswrapper[4807]: I1127 11:22:28.630043 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12ca370a2792f08740f2ae8aeb8915685335fbd06b7f2a628ba3db829ddbd333" Nov 27 11:22:28 crc kubenswrapper[4807]: I1127 11:22:28.630062 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h" Nov 27 11:22:34 crc kubenswrapper[4807]: I1127 11:22:34.857794 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-59f78dbdf9-fzjdb"] Nov 27 11:22:34 crc kubenswrapper[4807]: E1127 11:22:34.858487 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae63091-3a7b-4708-82c7-d59383b22b9b" containerName="extract" Nov 27 11:22:34 crc kubenswrapper[4807]: I1127 11:22:34.858498 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae63091-3a7b-4708-82c7-d59383b22b9b" containerName="extract" Nov 27 11:22:34 crc kubenswrapper[4807]: E1127 11:22:34.858513 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae63091-3a7b-4708-82c7-d59383b22b9b" containerName="util" Nov 27 11:22:34 crc kubenswrapper[4807]: I1127 11:22:34.858518 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae63091-3a7b-4708-82c7-d59383b22b9b" containerName="util" Nov 27 11:22:34 crc kubenswrapper[4807]: E1127 11:22:34.858530 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae63091-3a7b-4708-82c7-d59383b22b9b" containerName="pull" Nov 27 11:22:34 crc kubenswrapper[4807]: I1127 11:22:34.858537 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae63091-3a7b-4708-82c7-d59383b22b9b" containerName="pull" Nov 27 11:22:34 crc kubenswrapper[4807]: I1127 11:22:34.858657 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae63091-3a7b-4708-82c7-d59383b22b9b" containerName="extract" Nov 27 11:22:34 crc kubenswrapper[4807]: I1127 11:22:34.859042 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-59f78dbdf9-fzjdb" Nov 27 11:22:34 crc kubenswrapper[4807]: I1127 11:22:34.864544 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-dpv45" Nov 27 11:22:34 crc kubenswrapper[4807]: I1127 11:22:34.879841 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-59f78dbdf9-fzjdb"] Nov 27 11:22:35 crc kubenswrapper[4807]: I1127 11:22:35.008976 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkjv8\" (UniqueName: \"kubernetes.io/projected/ffe745e4-da98-4391-990b-a86d2fbc3346-kube-api-access-wkjv8\") pod \"openstack-operator-controller-operator-59f78dbdf9-fzjdb\" (UID: \"ffe745e4-da98-4391-990b-a86d2fbc3346\") " pod="openstack-operators/openstack-operator-controller-operator-59f78dbdf9-fzjdb" Nov 27 11:22:35 crc kubenswrapper[4807]: I1127 11:22:35.110430 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkjv8\" (UniqueName: \"kubernetes.io/projected/ffe745e4-da98-4391-990b-a86d2fbc3346-kube-api-access-wkjv8\") pod \"openstack-operator-controller-operator-59f78dbdf9-fzjdb\" (UID: \"ffe745e4-da98-4391-990b-a86d2fbc3346\") " pod="openstack-operators/openstack-operator-controller-operator-59f78dbdf9-fzjdb" Nov 27 11:22:35 crc kubenswrapper[4807]: I1127 11:22:35.128456 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkjv8\" (UniqueName: \"kubernetes.io/projected/ffe745e4-da98-4391-990b-a86d2fbc3346-kube-api-access-wkjv8\") pod \"openstack-operator-controller-operator-59f78dbdf9-fzjdb\" (UID: \"ffe745e4-da98-4391-990b-a86d2fbc3346\") " pod="openstack-operators/openstack-operator-controller-operator-59f78dbdf9-fzjdb" Nov 27 11:22:35 crc kubenswrapper[4807]: I1127 11:22:35.177198 4807 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-59f78dbdf9-fzjdb" Nov 27 11:22:35 crc kubenswrapper[4807]: I1127 11:22:35.624095 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-59f78dbdf9-fzjdb"] Nov 27 11:22:35 crc kubenswrapper[4807]: I1127 11:22:35.679313 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-59f78dbdf9-fzjdb" event={"ID":"ffe745e4-da98-4391-990b-a86d2fbc3346","Type":"ContainerStarted","Data":"7168989c57c07b14284e77f36bc96f0c9d1920557991be010e0e4a341752f242"} Nov 27 11:22:39 crc kubenswrapper[4807]: I1127 11:22:39.705288 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-59f78dbdf9-fzjdb" event={"ID":"ffe745e4-da98-4391-990b-a86d2fbc3346","Type":"ContainerStarted","Data":"1dc181b194bfb5e93b04893e26b884568c1d08bff9e92cb2715809d1c7d893c6"} Nov 27 11:22:39 crc kubenswrapper[4807]: I1127 11:22:39.706388 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-59f78dbdf9-fzjdb" Nov 27 11:22:39 crc kubenswrapper[4807]: I1127 11:22:39.741344 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-59f78dbdf9-fzjdb" podStartSLOduration=1.8506532679999999 podStartE2EDuration="5.741320836s" podCreationTimestamp="2025-11-27 11:22:34 +0000 UTC" firstStartedPulling="2025-11-27 11:22:35.633048422 +0000 UTC m=+796.732546620" lastFinishedPulling="2025-11-27 11:22:39.52371599 +0000 UTC m=+800.623214188" observedRunningTime="2025-11-27 11:22:39.738493971 +0000 UTC m=+800.837992169" watchObservedRunningTime="2025-11-27 11:22:39.741320836 +0000 UTC m=+800.840819044" Nov 27 11:22:45 crc kubenswrapper[4807]: I1127 11:22:45.180949 
4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-59f78dbdf9-fzjdb" Nov 27 11:22:50 crc kubenswrapper[4807]: I1127 11:22:50.921548 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:22:50 crc kubenswrapper[4807]: I1127 11:22:50.922067 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.167236 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.169524 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.171134 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-7fg88" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.179497 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.180582 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.182492 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-t4v78" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.194229 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.199098 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.207400 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-5wfgl"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.208365 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.210861 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-dnzpn" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.219305 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.220502 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.222403 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-snxjh" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.225379 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-5wfgl"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.231854 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.239598 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.240560 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.242665 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-2j7m7" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.259332 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.260344 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.263667 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9kdhn" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.268407 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.285225 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.295492 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6qch\" (UniqueName: \"kubernetes.io/projected/623644bf-2d87-4689-acea-cfaeca90285f-kube-api-access-x6qch\") pod \"cinder-operator-controller-manager-6b7f75547b-6sqtw\" (UID: \"623644bf-2d87-4689-acea-cfaeca90285f\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.295555 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd7tx\" (UniqueName: \"kubernetes.io/projected/7377040f-fbf5-4395-a903-99dbb10dbcac-kube-api-access-rd7tx\") pod \"barbican-operator-controller-manager-7b64f4fb85-z6w5s\" (UID: \"7377040f-fbf5-4395-a903-99dbb10dbcac\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.306360 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.308263 4807 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.309002 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.309145 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.315655 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.316293 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9s7zc" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.316412 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-74qcd" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.330513 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.344693 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.345779 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.353926 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.359500 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-mbknh" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.362382 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.371053 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.372102 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.375560 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-x4tqd" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.384721 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.385716 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.388165 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wd4hv" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.394224 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.399784 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65hwt\" (UniqueName: \"kubernetes.io/projected/c5b9cfda-ea17-4add-a121-036a989efeab-kube-api-access-65hwt\") pod \"heat-operator-controller-manager-5b77f656f-j2tq6\" (UID: \"c5b9cfda-ea17-4add-a121-036a989efeab\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.399825 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k96gk\" (UniqueName: \"kubernetes.io/projected/574b2edd-5058-4d84-a8b8-72258c3c9f7b-kube-api-access-k96gk\") pod \"designate-operator-controller-manager-955677c94-5wfgl\" (UID: \"574b2edd-5058-4d84-a8b8-72258c3c9f7b\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.399865 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6qch\" (UniqueName: \"kubernetes.io/projected/623644bf-2d87-4689-acea-cfaeca90285f-kube-api-access-x6qch\") pod \"cinder-operator-controller-manager-6b7f75547b-6sqtw\" (UID: \"623644bf-2d87-4689-acea-cfaeca90285f\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.399887 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvxps\" (UniqueName: \"kubernetes.io/projected/4ae17b3e-8de9-45e3-8404-2f2fda6c6b99-kube-api-access-rvxps\") pod \"horizon-operator-controller-manager-5d494799bf-fztl6\" (UID: \"4ae17b3e-8de9-45e3-8404-2f2fda6c6b99\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.399907 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd7tx\" (UniqueName: \"kubernetes.io/projected/7377040f-fbf5-4395-a903-99dbb10dbcac-kube-api-access-rd7tx\") pod \"barbican-operator-controller-manager-7b64f4fb85-z6w5s\" (UID: \"7377040f-fbf5-4395-a903-99dbb10dbcac\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.399967 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xgp8\" (UniqueName: \"kubernetes.io/projected/5ae030e1-b973-4137-abd1-1abc5f5d1153-kube-api-access-7xgp8\") pod \"glance-operator-controller-manager-589cbd6b5b-v9d6j\" (UID: \"5ae030e1-b973-4137-abd1-1abc5f5d1153\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.431098 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd7tx\" (UniqueName: \"kubernetes.io/projected/7377040f-fbf5-4395-a903-99dbb10dbcac-kube-api-access-rd7tx\") pod \"barbican-operator-controller-manager-7b64f4fb85-z6w5s\" (UID: \"7377040f-fbf5-4395-a903-99dbb10dbcac\") " pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.439119 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6qch\" (UniqueName: 
\"kubernetes.io/projected/623644bf-2d87-4689-acea-cfaeca90285f-kube-api-access-x6qch\") pod \"cinder-operator-controller-manager-6b7f75547b-6sqtw\" (UID: \"623644bf-2d87-4689-acea-cfaeca90285f\") " pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.446694 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.494417 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.495594 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.496717 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.497364 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-rw9h6" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.501539 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65hwt\" (UniqueName: \"kubernetes.io/projected/c5b9cfda-ea17-4add-a121-036a989efeab-kube-api-access-65hwt\") pod \"heat-operator-controller-manager-5b77f656f-j2tq6\" (UID: \"c5b9cfda-ea17-4add-a121-036a989efeab\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.501602 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k96gk\" (UniqueName: 
\"kubernetes.io/projected/574b2edd-5058-4d84-a8b8-72258c3c9f7b-kube-api-access-k96gk\") pod \"designate-operator-controller-manager-955677c94-5wfgl\" (UID: \"574b2edd-5058-4d84-a8b8-72258c3c9f7b\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.501640 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvxps\" (UniqueName: \"kubernetes.io/projected/4ae17b3e-8de9-45e3-8404-2f2fda6c6b99-kube-api-access-rvxps\") pod \"horizon-operator-controller-manager-5d494799bf-fztl6\" (UID: \"4ae17b3e-8de9-45e3-8404-2f2fda6c6b99\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.501664 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct7bs\" (UniqueName: \"kubernetes.io/projected/4be44e13-06b8-494e-8a62-7e8d8747692f-kube-api-access-ct7bs\") pod \"infra-operator-controller-manager-57548d458d-qg4bq\" (UID: \"4be44e13-06b8-494e-8a62-7e8d8747692f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.501684 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert\") pod \"infra-operator-controller-manager-57548d458d-qg4bq\" (UID: \"4be44e13-06b8-494e-8a62-7e8d8747692f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.501735 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v5hm\" (UniqueName: \"kubernetes.io/projected/9262ad56-c1b8-41ee-ab6b-1b3c57dbdb5b-kube-api-access-6v5hm\") pod 
\"ironic-operator-controller-manager-67cb4dc6d4-xtdbs\" (UID: \"9262ad56-c1b8-41ee-ab6b-1b3c57dbdb5b\") " pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.501755 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkn9w\" (UniqueName: \"kubernetes.io/projected/3ae6d3a5-8999-4c3d-a3de-b497ae0776f2-kube-api-access-qkn9w\") pod \"manila-operator-controller-manager-5d499bf58b-hcnzc\" (UID: \"3ae6d3a5-8999-4c3d-a3de-b497ae0776f2\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.501809 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bffd\" (UniqueName: \"kubernetes.io/projected/af2f67ab-040b-4ec1-bf21-db83dcaeb6d2-kube-api-access-7bffd\") pod \"keystone-operator-controller-manager-7b4567c7cf-lz6lg\" (UID: \"af2f67ab-040b-4ec1-bf21-db83dcaeb6d2\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.501827 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnzw9\" (UniqueName: \"kubernetes.io/projected/fe4ff55b-a2dd-4936-9016-d73ade2388a0-kube-api-access-lnzw9\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-5wb7l\" (UID: \"fe4ff55b-a2dd-4936-9016-d73ade2388a0\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.501850 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xgp8\" (UniqueName: \"kubernetes.io/projected/5ae030e1-b973-4137-abd1-1abc5f5d1153-kube-api-access-7xgp8\") pod \"glance-operator-controller-manager-589cbd6b5b-v9d6j\" (UID: 
\"5ae030e1-b973-4137-abd1-1abc5f5d1153\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.504156 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.508608 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.509975 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.512982 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-klk88" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.517343 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k96gk\" (UniqueName: \"kubernetes.io/projected/574b2edd-5058-4d84-a8b8-72258c3c9f7b-kube-api-access-k96gk\") pod \"designate-operator-controller-manager-955677c94-5wfgl\" (UID: \"574b2edd-5058-4d84-a8b8-72258c3c9f7b\") " pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.520693 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65hwt\" (UniqueName: \"kubernetes.io/projected/c5b9cfda-ea17-4add-a121-036a989efeab-kube-api-access-65hwt\") pod \"heat-operator-controller-manager-5b77f656f-j2tq6\" (UID: \"c5b9cfda-ea17-4add-a121-036a989efeab\") " pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.523835 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7xgp8\" (UniqueName: \"kubernetes.io/projected/5ae030e1-b973-4137-abd1-1abc5f5d1153-kube-api-access-7xgp8\") pod \"glance-operator-controller-manager-589cbd6b5b-v9d6j\" (UID: \"5ae030e1-b973-4137-abd1-1abc5f5d1153\") " pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.529478 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.536222 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.546962 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvxps\" (UniqueName: \"kubernetes.io/projected/4ae17b3e-8de9-45e3-8404-2f2fda6c6b99-kube-api-access-rvxps\") pod \"horizon-operator-controller-manager-5d494799bf-fztl6\" (UID: \"4ae17b3e-8de9-45e3-8404-2f2fda6c6b99\") " pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.549649 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.554679 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.561082 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.561712 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.565078 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-gvv7k" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.570999 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.580843 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.587712 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.588966 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.589636 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.592706 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-tqrrt" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.592895 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.594235 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.597331 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.598777 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-8qdf9" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.601692 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.602649 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct7bs\" (UniqueName: \"kubernetes.io/projected/4be44e13-06b8-494e-8a62-7e8d8747692f-kube-api-access-ct7bs\") pod \"infra-operator-controller-manager-57548d458d-qg4bq\" (UID: \"4be44e13-06b8-494e-8a62-7e8d8747692f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.602679 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert\") pod \"infra-operator-controller-manager-57548d458d-qg4bq\" (UID: \"4be44e13-06b8-494e-8a62-7e8d8747692f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.602709 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v5hm\" (UniqueName: \"kubernetes.io/projected/9262ad56-c1b8-41ee-ab6b-1b3c57dbdb5b-kube-api-access-6v5hm\") pod \"ironic-operator-controller-manager-67cb4dc6d4-xtdbs\" (UID: \"9262ad56-c1b8-41ee-ab6b-1b3c57dbdb5b\") " pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.602742 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkn9w\" (UniqueName: \"kubernetes.io/projected/3ae6d3a5-8999-4c3d-a3de-b497ae0776f2-kube-api-access-qkn9w\") pod \"manila-operator-controller-manager-5d499bf58b-hcnzc\" (UID: \"3ae6d3a5-8999-4c3d-a3de-b497ae0776f2\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.602796 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8bwp\" (UniqueName: \"kubernetes.io/projected/1b313bc2-c896-486c-a520-9843ec7bd6ad-kube-api-access-d8bwp\") pod \"octavia-operator-controller-manager-64cdc6ff96-k82xf\" (UID: \"1b313bc2-c896-486c-a520-9843ec7bd6ad\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.602827 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bffd\" (UniqueName: \"kubernetes.io/projected/af2f67ab-040b-4ec1-bf21-db83dcaeb6d2-kube-api-access-7bffd\") pod \"keystone-operator-controller-manager-7b4567c7cf-lz6lg\" (UID: 
\"af2f67ab-040b-4ec1-bf21-db83dcaeb6d2\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.602844 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnzw9\" (UniqueName: \"kubernetes.io/projected/fe4ff55b-a2dd-4936-9016-d73ade2388a0-kube-api-access-lnzw9\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-5wb7l\" (UID: \"fe4ff55b-a2dd-4936-9016-d73ade2388a0\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.602864 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvr2w\" (UniqueName: \"kubernetes.io/projected/dcfd531a-2394-41c7-b05a-5b8e95f8459c-kube-api-access-kvr2w\") pod \"nova-operator-controller-manager-79556f57fc-6l4tm\" (UID: \"dcfd531a-2394-41c7-b05a-5b8e95f8459c\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm" Nov 27 11:23:03 crc kubenswrapper[4807]: E1127 11:23:03.603988 4807 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 11:23:03 crc kubenswrapper[4807]: E1127 11:23:03.604263 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert podName:4be44e13-06b8-494e-8a62-7e8d8747692f nodeName:}" failed. No retries permitted until 2025-11-27 11:23:04.104227906 +0000 UTC m=+825.203726104 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert") pod "infra-operator-controller-manager-57548d458d-qg4bq" (UID: "4be44e13-06b8-494e-8a62-7e8d8747692f") : secret "infra-operator-webhook-server-cert" not found Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.606916 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.611889 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-7z79l" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.613027 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.618164 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.635821 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bffd\" (UniqueName: \"kubernetes.io/projected/af2f67ab-040b-4ec1-bf21-db83dcaeb6d2-kube-api-access-7bffd\") pod \"keystone-operator-controller-manager-7b4567c7cf-lz6lg\" (UID: \"af2f67ab-040b-4ec1-bf21-db83dcaeb6d2\") " pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.636080 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.642659 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v5hm\" (UniqueName: 
\"kubernetes.io/projected/9262ad56-c1b8-41ee-ab6b-1b3c57dbdb5b-kube-api-access-6v5hm\") pod \"ironic-operator-controller-manager-67cb4dc6d4-xtdbs\" (UID: \"9262ad56-c1b8-41ee-ab6b-1b3c57dbdb5b\") " pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.643107 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnzw9\" (UniqueName: \"kubernetes.io/projected/fe4ff55b-a2dd-4936-9016-d73ade2388a0-kube-api-access-lnzw9\") pod \"mariadb-operator-controller-manager-66f4dd4bc7-5wb7l\" (UID: \"fe4ff55b-a2dd-4936-9016-d73ade2388a0\") " pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.647816 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct7bs\" (UniqueName: \"kubernetes.io/projected/4be44e13-06b8-494e-8a62-7e8d8747692f-kube-api-access-ct7bs\") pod \"infra-operator-controller-manager-57548d458d-qg4bq\" (UID: \"4be44e13-06b8-494e-8a62-7e8d8747692f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.664531 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkn9w\" (UniqueName: \"kubernetes.io/projected/3ae6d3a5-8999-4c3d-a3de-b497ae0776f2-kube-api-access-qkn9w\") pod \"manila-operator-controller-manager-5d499bf58b-hcnzc\" (UID: \"3ae6d3a5-8999-4c3d-a3de-b497ae0776f2\") " pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.671271 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.672837 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.674409 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.677150 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.680940 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-6f4b7" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.691211 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.694457 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.696263 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.697933 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-4cjlv" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.701473 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.705429 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grjbh\" (UniqueName: \"kubernetes.io/projected/0bde1253-53c0-4864-b22e-dcf25751388e-kube-api-access-grjbh\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2\" (UID: \"0bde1253-53c0-4864-b22e-dcf25751388e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.705470 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2d92\" (UniqueName: \"kubernetes.io/projected/787c342b-413f-495a-8b31-bd8a01f35c3a-kube-api-access-s2d92\") pod \"ovn-operator-controller-manager-56897c768d-d9mj8\" (UID: \"787c342b-413f-495a-8b31-bd8a01f35c3a\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.705499 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8bwp\" (UniqueName: \"kubernetes.io/projected/1b313bc2-c896-486c-a520-9843ec7bd6ad-kube-api-access-d8bwp\") pod \"octavia-operator-controller-manager-64cdc6ff96-k82xf\" (UID: \"1b313bc2-c896-486c-a520-9843ec7bd6ad\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 
11:23:03.705524 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvr2w\" (UniqueName: \"kubernetes.io/projected/dcfd531a-2394-41c7-b05a-5b8e95f8459c-kube-api-access-kvr2w\") pod \"nova-operator-controller-manager-79556f57fc-6l4tm\" (UID: \"dcfd531a-2394-41c7-b05a-5b8e95f8459c\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.705570 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2\" (UID: \"0bde1253-53c0-4864-b22e-dcf25751388e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.705594 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rc5s\" (UniqueName: \"kubernetes.io/projected/961001c9-3719-4306-8d38-b3c5d8e202bc-kube-api-access-8rc5s\") pod \"neutron-operator-controller-manager-6fdcddb789-xfrls\" (UID: \"961001c9-3719-4306-8d38-b3c5d8e202bc\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.705635 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsj4q\" (UniqueName: \"kubernetes.io/projected/a0e8d2a3-0f58-4a1d-9867-648001196d2e-kube-api-access-rsj4q\") pod \"placement-operator-controller-manager-57988cc5b5-cw9bt\" (UID: \"a0e8d2a3-0f58-4a1d-9867-648001196d2e\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.712774 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.730054 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8bwp\" (UniqueName: \"kubernetes.io/projected/1b313bc2-c896-486c-a520-9843ec7bd6ad-kube-api-access-d8bwp\") pod \"octavia-operator-controller-manager-64cdc6ff96-k82xf\" (UID: \"1b313bc2-c896-486c-a520-9843ec7bd6ad\") " pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.756391 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.760805 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.762870 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvr2w\" (UniqueName: \"kubernetes.io/projected/dcfd531a-2394-41c7-b05a-5b8e95f8459c-kube-api-access-kvr2w\") pod \"nova-operator-controller-manager-79556f57fc-6l4tm\" (UID: \"dcfd531a-2394-41c7-b05a-5b8e95f8459c\") " pod="openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.763899 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-fsrl2" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.777130 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.793794 4807 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.794919 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.813462 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-kmtgc" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.815740 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2\" (UID: \"0bde1253-53c0-4864-b22e-dcf25751388e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.815787 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg6wc\" (UniqueName: \"kubernetes.io/projected/94062b2f-3f5a-404d-9b0a-8b7f858e1322-kube-api-access-rg6wc\") pod \"swift-operator-controller-manager-d77b94747-pd7pj\" (UID: \"94062b2f-3f5a-404d-9b0a-8b7f858e1322\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.815810 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rc5s\" (UniqueName: \"kubernetes.io/projected/961001c9-3719-4306-8d38-b3c5d8e202bc-kube-api-access-8rc5s\") pod \"neutron-operator-controller-manager-6fdcddb789-xfrls\" (UID: \"961001c9-3719-4306-8d38-b3c5d8e202bc\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.815843 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr2vp\" (UniqueName: \"kubernetes.io/projected/5b554316-8e33-4fa8-a340-91d9e0f6b0de-kube-api-access-lr2vp\") pod \"telemetry-operator-controller-manager-76cc84c6bb-svq2j\" (UID: \"5b554316-8e33-4fa8-a340-91d9e0f6b0de\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.815887 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsj4q\" (UniqueName: \"kubernetes.io/projected/a0e8d2a3-0f58-4a1d-9867-648001196d2e-kube-api-access-rsj4q\") pod \"placement-operator-controller-manager-57988cc5b5-cw9bt\" (UID: \"a0e8d2a3-0f58-4a1d-9867-648001196d2e\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.815938 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grjbh\" (UniqueName: \"kubernetes.io/projected/0bde1253-53c0-4864-b22e-dcf25751388e-kube-api-access-grjbh\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2\" (UID: \"0bde1253-53c0-4864-b22e-dcf25751388e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.815964 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2d92\" (UniqueName: \"kubernetes.io/projected/787c342b-413f-495a-8b31-bd8a01f35c3a-kube-api-access-s2d92\") pod \"ovn-operator-controller-manager-56897c768d-d9mj8\" (UID: \"787c342b-413f-495a-8b31-bd8a01f35c3a\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8" Nov 27 11:23:03 crc kubenswrapper[4807]: E1127 11:23:03.816414 4807 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Nov 27 11:23:03 crc kubenswrapper[4807]: E1127 11:23:03.816462 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert podName:0bde1253-53c0-4864-b22e-dcf25751388e nodeName:}" failed. No retries permitted until 2025-11-27 11:23:04.316446708 +0000 UTC m=+825.415944906 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" (UID: "0bde1253-53c0-4864-b22e-dcf25751388e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.844581 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grjbh\" (UniqueName: \"kubernetes.io/projected/0bde1253-53c0-4864-b22e-dcf25751388e-kube-api-access-grjbh\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2\" (UID: \"0bde1253-53c0-4864-b22e-dcf25751388e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.859114 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.866984 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2d92\" (UniqueName: \"kubernetes.io/projected/787c342b-413f-495a-8b31-bd8a01f35c3a-kube-api-access-s2d92\") pod \"ovn-operator-controller-manager-56897c768d-d9mj8\" (UID: \"787c342b-413f-495a-8b31-bd8a01f35c3a\") " pod="openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.889831 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rsj4q\" (UniqueName: \"kubernetes.io/projected/a0e8d2a3-0f58-4a1d-9867-648001196d2e-kube-api-access-rsj4q\") pod \"placement-operator-controller-manager-57988cc5b5-cw9bt\" (UID: \"a0e8d2a3-0f58-4a1d-9867-648001196d2e\") " pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.890568 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rc5s\" (UniqueName: \"kubernetes.io/projected/961001c9-3719-4306-8d38-b3c5d8e202bc-kube-api-access-8rc5s\") pod \"neutron-operator-controller-manager-6fdcddb789-xfrls\" (UID: \"961001c9-3719-4306-8d38-b3c5d8e202bc\") " pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.892882 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.911294 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.912333 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.914313 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.914354 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-f8df9" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.914754 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.918137 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wss5m\" (UniqueName: \"kubernetes.io/projected/13749acc-f727-4c3a-b24a-b56bd6b7533d-kube-api-access-wss5m\") pod \"watcher-operator-controller-manager-656dcb59d4-tj4db\" (UID: \"13749acc-f727-4c3a-b24a-b56bd6b7533d\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.918191 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg6wc\" (UniqueName: \"kubernetes.io/projected/94062b2f-3f5a-404d-9b0a-8b7f858e1322-kube-api-access-rg6wc\") pod \"swift-operator-controller-manager-d77b94747-pd7pj\" (UID: \"94062b2f-3f5a-404d-9b0a-8b7f858e1322\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.918222 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr2vp\" (UniqueName: \"kubernetes.io/projected/5b554316-8e33-4fa8-a340-91d9e0f6b0de-kube-api-access-lr2vp\") pod \"telemetry-operator-controller-manager-76cc84c6bb-svq2j\" (UID: \"5b554316-8e33-4fa8-a340-91d9e0f6b0de\") " 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.918309 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkt44\" (UniqueName: \"kubernetes.io/projected/7fbca001-90e9-4da2-bd14-6bc00a48ed40-kube-api-access-bkt44\") pod \"test-operator-controller-manager-5cd6c7f4c8-mw4mw\" (UID: \"7fbca001-90e9-4da2-bd14-6bc00a48ed40\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.922127 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.929077 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.938540 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.945413 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg6wc\" (UniqueName: \"kubernetes.io/projected/94062b2f-3f5a-404d-9b0a-8b7f858e1322-kube-api-access-rg6wc\") pod \"swift-operator-controller-manager-d77b94747-pd7pj\" (UID: \"94062b2f-3f5a-404d-9b0a-8b7f858e1322\") " pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.950527 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr2vp\" (UniqueName: \"kubernetes.io/projected/5b554316-8e33-4fa8-a340-91d9e0f6b0de-kube-api-access-lr2vp\") pod \"telemetry-operator-controller-manager-76cc84c6bb-svq2j\" (UID: \"5b554316-8e33-4fa8-a340-91d9e0f6b0de\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.965682 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.980801 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.982184 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc" Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.987137 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc"] Nov 27 11:23:03 crc kubenswrapper[4807]: I1127 11:23:03.988352 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-w2gvx" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.018938 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wss5m\" (UniqueName: \"kubernetes.io/projected/13749acc-f727-4c3a-b24a-b56bd6b7533d-kube-api-access-wss5m\") pod \"watcher-operator-controller-manager-656dcb59d4-tj4db\" (UID: \"13749acc-f727-4c3a-b24a-b56bd6b7533d\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.018974 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.019025 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tf88\" (UniqueName: \"kubernetes.io/projected/649aedb9-ad77-47fa-a7e9-89cb12c65928-kube-api-access-2tf88\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.019085 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkt44\" (UniqueName: \"kubernetes.io/projected/7fbca001-90e9-4da2-bd14-6bc00a48ed40-kube-api-access-bkt44\") pod \"test-operator-controller-manager-5cd6c7f4c8-mw4mw\" (UID: \"7fbca001-90e9-4da2-bd14-6bc00a48ed40\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.019108 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.024300 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.039476 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkt44\" (UniqueName: \"kubernetes.io/projected/7fbca001-90e9-4da2-bd14-6bc00a48ed40-kube-api-access-bkt44\") pod \"test-operator-controller-manager-5cd6c7f4c8-mw4mw\" (UID: \"7fbca001-90e9-4da2-bd14-6bc00a48ed40\") " pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.039784 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s"] Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.043992 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.048770 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wss5m\" (UniqueName: \"kubernetes.io/projected/13749acc-f727-4c3a-b24a-b56bd6b7533d-kube-api-access-wss5m\") pod \"watcher-operator-controller-manager-656dcb59d4-tj4db\" (UID: \"13749acc-f727-4c3a-b24a-b56bd6b7533d\") " pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.100988 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.119884 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.119930 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4njgj\" (UniqueName: \"kubernetes.io/projected/8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91-kube-api-access-4njgj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l72vc\" (UID: \"8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.119981 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tf88\" (UniqueName: \"kubernetes.io/projected/649aedb9-ad77-47fa-a7e9-89cb12c65928-kube-api-access-2tf88\") pod 
\"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.120003 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert\") pod \"infra-operator-controller-manager-57548d458d-qg4bq\" (UID: \"4be44e13-06b8-494e-8a62-7e8d8747692f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.120052 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.120178 4807 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.120230 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs podName:649aedb9-ad77-47fa-a7e9-89cb12c65928 nodeName:}" failed. No retries permitted until 2025-11-27 11:23:04.620214653 +0000 UTC m=+825.719712851 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs") pod "openstack-operator-controller-manager-6456fcdb48-tjnrt" (UID: "649aedb9-ad77-47fa-a7e9-89cb12c65928") : secret "webhook-server-cert" not found Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.120526 4807 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.120555 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs podName:649aedb9-ad77-47fa-a7e9-89cb12c65928 nodeName:}" failed. No retries permitted until 2025-11-27 11:23:04.620547742 +0000 UTC m=+825.720045940 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs") pod "openstack-operator-controller-manager-6456fcdb48-tjnrt" (UID: "649aedb9-ad77-47fa-a7e9-89cb12c65928") : secret "metrics-server-cert" not found Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.120778 4807 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.120805 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert podName:4be44e13-06b8-494e-8a62-7e8d8747692f nodeName:}" failed. No retries permitted until 2025-11-27 11:23:05.120798319 +0000 UTC m=+826.220296517 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert") pod "infra-operator-controller-manager-57548d458d-qg4bq" (UID: "4be44e13-06b8-494e-8a62-7e8d8747692f") : secret "infra-operator-webhook-server-cert" not found Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.126527 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.143672 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.144181 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tf88\" (UniqueName: \"kubernetes.io/projected/649aedb9-ad77-47fa-a7e9-89cb12c65928-kube-api-access-2tf88\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.170370 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw"] Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.221564 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4njgj\" (UniqueName: \"kubernetes.io/projected/8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91-kube-api-access-4njgj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l72vc\" (UID: \"8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.221949 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.244794 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4njgj\" (UniqueName: \"kubernetes.io/projected/8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91-kube-api-access-4njgj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-l72vc\" (UID: \"8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.323302 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2\" (UID: \"0bde1253-53c0-4864-b22e-dcf25751388e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.323464 4807 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.323515 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert podName:0bde1253-53c0-4864-b22e-dcf25751388e nodeName:}" failed. No retries permitted until 2025-11-27 11:23:05.323498197 +0000 UTC m=+826.422996385 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" (UID: "0bde1253-53c0-4864-b22e-dcf25751388e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.333120 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.362586 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j"] Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.369042 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6"] Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.378218 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-955677c94-5wfgl"] Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.387297 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6"] Nov 27 11:23:04 crc kubenswrapper[4807]: W1127 11:23:04.431874 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae17b3e_8de9_45e3_8404_2f2fda6c6b99.slice/crio-6a64001613baddb82acf0440e2b26e4d21bd848857df79c77dd9ac9b6a1ebbe1 WatchSource:0}: Error finding container 6a64001613baddb82acf0440e2b26e4d21bd848857df79c77dd9ac9b6a1ebbe1: Status 404 returned error can't find the container with id 6a64001613baddb82acf0440e2b26e4d21bd848857df79c77dd9ac9b6a1ebbe1 Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.537289 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg"] Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.567136 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l"] Nov 27 11:23:04 crc kubenswrapper[4807]: W1127 11:23:04.581510 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ae6d3a5_8999_4c3d_a3de_b497ae0776f2.slice/crio-fca467b0fc3b585b63e1b9eb27722d44e8dedbaf472890219938cadfe88e19e5 WatchSource:0}: Error finding container fca467b0fc3b585b63e1b9eb27722d44e8dedbaf472890219938cadfe88e19e5: Status 404 returned error can't find the container with id fca467b0fc3b585b63e1b9eb27722d44e8dedbaf472890219938cadfe88e19e5 Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.591985 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc"] Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.627866 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.627958 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.629388 4807 secret.go:188] 
Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.629468 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs podName:649aedb9-ad77-47fa-a7e9-89cb12c65928 nodeName:}" failed. No retries permitted until 2025-11-27 11:23:05.62944369 +0000 UTC m=+826.728941888 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs") pod "openstack-operator-controller-manager-6456fcdb48-tjnrt" (UID: "649aedb9-ad77-47fa-a7e9-89cb12c65928") : secret "webhook-server-cert" not found Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.629590 4807 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.629641 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs podName:649aedb9-ad77-47fa-a7e9-89cb12c65928 nodeName:}" failed. No retries permitted until 2025-11-27 11:23:05.629618314 +0000 UTC m=+826.729116512 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs") pod "openstack-operator-controller-manager-6456fcdb48-tjnrt" (UID: "649aedb9-ad77-47fa-a7e9-89cb12c65928") : secret "metrics-server-cert" not found Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.684486 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls"] Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.702452 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm"] Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.721558 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs"] Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.728145 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8"] Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.745507 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf"] Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.765517 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:ddc8a82f05930db8ee7a8d6d189b5a66373060656e4baf71ac302f89c477da4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d8bwp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-64cdc6ff96-k82xf_openstack-operators(1b313bc2-c896-486c-a520-9843ec7bd6ad): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.768588 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d8bwp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-64cdc6ff96-k82xf_openstack-operators(1b313bc2-c896-486c-a520-9843ec7bd6ad): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.769775 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" podUID="1b313bc2-c896-486c-a520-9843ec7bd6ad" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.830775 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj"] Nov 27 11:23:04 crc kubenswrapper[4807]: W1127 11:23:04.849330 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94062b2f_3f5a_404d_9b0a_8b7f858e1322.slice/crio-9b9cf562b28676c4c568e6c4343a55e3c50702195ca5d77400fd595df3842c00 
WatchSource:0}: Error finding container 9b9cf562b28676c4c568e6c4343a55e3c50702195ca5d77400fd595df3842c00: Status 404 returned error can't find the container with id 9b9cf562b28676c4c568e6c4343a55e3c50702195ca5d77400fd595df3842c00 Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.850908 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt"] Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.852270 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rg6wc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-d77b94747-pd7pj_openstack-operators(94062b2f-3f5a-404d-9b0a-8b7f858e1322): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.854083 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rg6wc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-d77b94747-pd7pj_openstack-operators(94062b2f-3f5a-404d-9b0a-8b7f858e1322): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.855296 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" podUID="94062b2f-3f5a-404d-9b0a-8b7f858e1322" Nov 27 11:23:04 crc kubenswrapper[4807]: W1127 11:23:04.860491 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b554316_8e33_4fa8_a340_91d9e0f6b0de.slice/crio-106ea2b88a99fca08a89cf49c9633207d3ee0800044134c3a02410136b3ee20c WatchSource:0}: Error finding container 106ea2b88a99fca08a89cf49c9633207d3ee0800044134c3a02410136b3ee20c: Status 404 returned error can't find the container with id 
106ea2b88a99fca08a89cf49c9633207d3ee0800044134c3a02410136b3ee20c Nov 27 11:23:04 crc kubenswrapper[4807]: W1127 11:23:04.861954 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13749acc_f727_4c3a_b24a_b56bd6b7533d.slice/crio-b4c5e88c7092a6015b03a5b1f8e62726203e2a186943b83201347dc90912b4ec WatchSource:0}: Error finding container b4c5e88c7092a6015b03a5b1f8e62726203e2a186943b83201347dc90912b4ec: Status 404 returned error can't find the container with id b4c5e88c7092a6015b03a5b1f8e62726203e2a186943b83201347dc90912b4ec Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.863261 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lr2vp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-svq2j_openstack-operators(5b554316-8e33-4fa8-a340-91d9e0f6b0de): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.864229 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db"] Nov 27 11:23:04 crc kubenswrapper[4807]: W1127 11:23:04.865397 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0e8d2a3_0f58_4a1d_9867_648001196d2e.slice/crio-1322e0305255af44fbbd499a5c42a9c01b648565fcadb5a33afc3c9156552536 WatchSource:0}: Error finding container 1322e0305255af44fbbd499a5c42a9c01b648565fcadb5a33afc3c9156552536: Status 404 returned error can't find the container with id 
1322e0305255af44fbbd499a5c42a9c01b648565fcadb5a33afc3c9156552536 Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.867588 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lr2vp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-svq2j_openstack-operators(5b554316-8e33-4fa8-a340-91d9e0f6b0de): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.867596 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:6bed55b172b9ee8ccc3952cbfc543d8bd44e2690f6db94348a754152fd78f4cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wss5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-656dcb59d4-tj4db_openstack-operators(13749acc-f727-4c3a-b24a-b56bd6b7533d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.870126 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wss5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-656dcb59d4-tj4db_openstack-operators(13749acc-f727-4c3a-b24a-b56bd6b7533d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.870421 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" podUID="5b554316-8e33-4fa8-a340-91d9e0f6b0de" Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.871225 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" podUID="13749acc-f727-4c3a-b24a-b56bd6b7533d" Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.872731 4807 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rsj4q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57988cc5b5-cw9bt_openstack-operators(a0e8d2a3-0f58-4a1d-9867-648001196d2e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.874463 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rsj4q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57988cc5b5-cw9bt_openstack-operators(a0e8d2a3-0f58-4a1d-9867-648001196d2e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.874695 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j"] Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.875797 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" podUID="a0e8d2a3-0f58-4a1d-9867-648001196d2e" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.903032 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm" 
event={"ID":"dcfd531a-2394-41c7-b05a-5b8e95f8459c","Type":"ContainerStarted","Data":"d682dca86913bca3e0d552253991a6188a78eba17cd523631744af024bbda1bc"} Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.904144 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8" event={"ID":"787c342b-413f-495a-8b31-bd8a01f35c3a","Type":"ContainerStarted","Data":"aefa51c01f50d61aff44ff3442a92977f4379a011ad883c44c84eb7bcf691e04"} Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.905125 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" event={"ID":"5ae030e1-b973-4137-abd1-1abc5f5d1153","Type":"ContainerStarted","Data":"ac620a67fe48c0fa03114ba8ccfb66c092599292c0eee50065117711efc3e867"} Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.906009 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" event={"ID":"574b2edd-5058-4d84-a8b8-72258c3c9f7b","Type":"ContainerStarted","Data":"001c39e1fd657c11ca989185c3dccaac8ea6da8add52025471c7b9f8951d3a76"} Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.907156 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" event={"ID":"1b313bc2-c896-486c-a520-9843ec7bd6ad","Type":"ContainerStarted","Data":"aa990a5816be4183af8f504646f72eb48d44f7a1aed669b68997d609e906c6ef"} Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.908730 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls" event={"ID":"961001c9-3719-4306-8d38-b3c5d8e202bc","Type":"ContainerStarted","Data":"9a2896ca1d20bce98ed472fcca714c9a9cea2e8b52eb3d0a030640c0bb4c8bdc"} Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.909559 4807 pod_workers.go:1301] "Error syncing 
pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ddc8a82f05930db8ee7a8d6d189b5a66373060656e4baf71ac302f89c477da4c\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" podUID="1b313bc2-c896-486c-a520-9843ec7bd6ad" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.910131 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw" event={"ID":"623644bf-2d87-4689-acea-cfaeca90285f","Type":"ContainerStarted","Data":"90ec658437935a8a41dcdfebfbc8045a3a3add613247794022469eb59a492078"} Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.911291 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg" event={"ID":"af2f67ab-040b-4ec1-bf21-db83dcaeb6d2","Type":"ContainerStarted","Data":"5d3fe1d01ad3b53895e79920a2dd6b6060121c6dab4a2690b7705d5aec41c4ac"} Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.912265 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" event={"ID":"c5b9cfda-ea17-4add-a121-036a989efeab","Type":"ContainerStarted","Data":"3f650819aac18c5339dfa90aa636998d3e6213ba6c8a56cc326ec22d3ac48cc0"} Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.913091 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s" event={"ID":"7377040f-fbf5-4395-a903-99dbb10dbcac","Type":"ContainerStarted","Data":"5c9ca8f403d04f8395166870ddd20b2d216ba58d1709152699c03ddc7eb4c59e"} Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.914109 
4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" event={"ID":"fe4ff55b-a2dd-4936-9016-d73ade2388a0","Type":"ContainerStarted","Data":"eb5d1872883462c8ccdf1ccc472dda8a896e207096b733d36f092c43adbd0e9a"} Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.915019 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" event={"ID":"94062b2f-3f5a-404d-9b0a-8b7f858e1322","Type":"ContainerStarted","Data":"9b9cf562b28676c4c568e6c4343a55e3c50702195ca5d77400fd595df3842c00"} Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.916375 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" event={"ID":"5b554316-8e33-4fa8-a340-91d9e0f6b0de","Type":"ContainerStarted","Data":"106ea2b88a99fca08a89cf49c9633207d3ee0800044134c3a02410136b3ee20c"} Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.916870 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" podUID="94062b2f-3f5a-404d-9b0a-8b7f858e1322" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.917878 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" event={"ID":"a0e8d2a3-0f58-4a1d-9867-648001196d2e","Type":"ContainerStarted","Data":"1322e0305255af44fbbd499a5c42a9c01b648565fcadb5a33afc3c9156552536"} Nov 27 11:23:04 crc 
kubenswrapper[4807]: E1127 11:23:04.918066 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" podUID="5b554316-8e33-4fa8-a340-91d9e0f6b0de" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.919603 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc" event={"ID":"3ae6d3a5-8999-4c3d-a3de-b497ae0776f2","Type":"ContainerStarted","Data":"fca467b0fc3b585b63e1b9eb27722d44e8dedbaf472890219938cadfe88e19e5"} Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.919953 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" podUID="a0e8d2a3-0f58-4a1d-9867-648001196d2e" Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.928456 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs" event={"ID":"9262ad56-c1b8-41ee-ab6b-1b3c57dbdb5b","Type":"ContainerStarted","Data":"63b3794737d0ea4e521828877bafcc24dd5d40d99bf12f39755650b1c91472ef"} Nov 27 11:23:04 crc 
kubenswrapper[4807]: I1127 11:23:04.929917 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6" event={"ID":"4ae17b3e-8de9-45e3-8404-2f2fda6c6b99","Type":"ContainerStarted","Data":"6a64001613baddb82acf0440e2b26e4d21bd848857df79c77dd9ac9b6a1ebbe1"} Nov 27 11:23:04 crc kubenswrapper[4807]: I1127 11:23:04.933232 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" event={"ID":"13749acc-f727-4c3a-b24a-b56bd6b7533d","Type":"ContainerStarted","Data":"b4c5e88c7092a6015b03a5b1f8e62726203e2a186943b83201347dc90912b4ec"} Nov 27 11:23:04 crc kubenswrapper[4807]: E1127 11:23:04.935175 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6bed55b172b9ee8ccc3952cbfc543d8bd44e2690f6db94348a754152fd78f4cf\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" podUID="13749acc-f727-4c3a-b24a-b56bd6b7533d" Nov 27 11:23:05 crc kubenswrapper[4807]: I1127 11:23:05.002112 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc"] Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.012850 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4njgj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-l72vc_openstack-operators(8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.014056 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc" podUID="8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91" Nov 27 11:23:05 crc kubenswrapper[4807]: I1127 11:23:05.023266 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw"] Nov 27 11:23:05 crc kubenswrapper[4807]: I1127 11:23:05.137305 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert\") pod \"infra-operator-controller-manager-57548d458d-qg4bq\" (UID: \"4be44e13-06b8-494e-8a62-7e8d8747692f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.137567 4807 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.137616 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert podName:4be44e13-06b8-494e-8a62-7e8d8747692f nodeName:}" failed. No retries permitted until 2025-11-27 11:23:07.137601107 +0000 UTC m=+828.237099305 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert") pod "infra-operator-controller-manager-57548d458d-qg4bq" (UID: "4be44e13-06b8-494e-8a62-7e8d8747692f") : secret "infra-operator-webhook-server-cert" not found Nov 27 11:23:05 crc kubenswrapper[4807]: I1127 11:23:05.343027 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2\" (UID: \"0bde1253-53c0-4864-b22e-dcf25751388e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.343179 4807 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.343225 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert podName:0bde1253-53c0-4864-b22e-dcf25751388e nodeName:}" failed. No retries permitted until 2025-11-27 11:23:07.343212733 +0000 UTC m=+828.442710931 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" (UID: "0bde1253-53c0-4864-b22e-dcf25751388e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 11:23:05 crc kubenswrapper[4807]: I1127 11:23:05.666932 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:05 crc kubenswrapper[4807]: I1127 11:23:05.667013 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.667159 4807 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.667212 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs podName:649aedb9-ad77-47fa-a7e9-89cb12c65928 nodeName:}" failed. No retries permitted until 2025-11-27 11:23:07.667194528 +0000 UTC m=+828.766692726 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs") pod "openstack-operator-controller-manager-6456fcdb48-tjnrt" (UID: "649aedb9-ad77-47fa-a7e9-89cb12c65928") : secret "metrics-server-cert" not found Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.667285 4807 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.667309 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs podName:649aedb9-ad77-47fa-a7e9-89cb12c65928 nodeName:}" failed. No retries permitted until 2025-11-27 11:23:07.6673018 +0000 UTC m=+828.766799998 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs") pod "openstack-operator-controller-manager-6456fcdb48-tjnrt" (UID: "649aedb9-ad77-47fa-a7e9-89cb12c65928") : secret "webhook-server-cert" not found Nov 27 11:23:05 crc kubenswrapper[4807]: I1127 11:23:05.950781 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw" event={"ID":"7fbca001-90e9-4da2-bd14-6bc00a48ed40","Type":"ContainerStarted","Data":"2fd11f60db5e9ee6c27f88f2029f5847c4db9a10e25f3fcfd189db209eda38de"} Nov 27 11:23:05 crc kubenswrapper[4807]: I1127 11:23:05.956534 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc" event={"ID":"8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91","Type":"ContainerStarted","Data":"4aed80e0021579745198598d301817c57ce79073017c65fea2003b3d2c8f2c58"} Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.961168 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc" podUID="8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91" Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.961962 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" podUID="94062b2f-3f5a-404d-9b0a-8b7f858e1322" Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.962281 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" podUID="5b554316-8e33-4fa8-a340-91d9e0f6b0de" Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.963855 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ddc8a82f05930db8ee7a8d6d189b5a66373060656e4baf71ac302f89c477da4c\\\"\", failed to \"StartContainer\" for 
\"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" podUID="1b313bc2-c896-486c-a520-9843ec7bd6ad" Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.983709 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:225958f250a1075b69439d776a13acc45c78695c21abda23600fb53ca1640423\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" podUID="a0e8d2a3-0f58-4a1d-9867-648001196d2e" Nov 27 11:23:05 crc kubenswrapper[4807]: E1127 11:23:05.983789 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:6bed55b172b9ee8ccc3952cbfc543d8bd44e2690f6db94348a754152fd78f4cf\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" podUID="13749acc-f727-4c3a-b24a-b56bd6b7533d" Nov 27 11:23:06 crc kubenswrapper[4807]: E1127 11:23:06.968613 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc" 
podUID="8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91" Nov 27 11:23:07 crc kubenswrapper[4807]: I1127 11:23:07.222235 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert\") pod \"infra-operator-controller-manager-57548d458d-qg4bq\" (UID: \"4be44e13-06b8-494e-8a62-7e8d8747692f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:07 crc kubenswrapper[4807]: E1127 11:23:07.222678 4807 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 11:23:07 crc kubenswrapper[4807]: E1127 11:23:07.223058 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert podName:4be44e13-06b8-494e-8a62-7e8d8747692f nodeName:}" failed. No retries permitted until 2025-11-27 11:23:11.22303124 +0000 UTC m=+832.322529438 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert") pod "infra-operator-controller-manager-57548d458d-qg4bq" (UID: "4be44e13-06b8-494e-8a62-7e8d8747692f") : secret "infra-operator-webhook-server-cert" not found Nov 27 11:23:07 crc kubenswrapper[4807]: I1127 11:23:07.425195 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2\" (UID: \"0bde1253-53c0-4864-b22e-dcf25751388e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:07 crc kubenswrapper[4807]: E1127 11:23:07.425390 4807 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 11:23:07 crc kubenswrapper[4807]: E1127 11:23:07.425441 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert podName:0bde1253-53c0-4864-b22e-dcf25751388e nodeName:}" failed. No retries permitted until 2025-11-27 11:23:11.42542795 +0000 UTC m=+832.524926148 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" (UID: "0bde1253-53c0-4864-b22e-dcf25751388e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 11:23:07 crc kubenswrapper[4807]: I1127 11:23:07.729576 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:07 crc kubenswrapper[4807]: I1127 11:23:07.729666 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:07 crc kubenswrapper[4807]: E1127 11:23:07.729862 4807 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 11:23:07 crc kubenswrapper[4807]: E1127 11:23:07.729897 4807 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 11:23:07 crc kubenswrapper[4807]: E1127 11:23:07.729953 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs podName:649aedb9-ad77-47fa-a7e9-89cb12c65928 nodeName:}" failed. No retries permitted until 2025-11-27 11:23:11.729932695 +0000 UTC m=+832.829430893 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs") pod "openstack-operator-controller-manager-6456fcdb48-tjnrt" (UID: "649aedb9-ad77-47fa-a7e9-89cb12c65928") : secret "webhook-server-cert" not found Nov 27 11:23:07 crc kubenswrapper[4807]: E1127 11:23:07.729974 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs podName:649aedb9-ad77-47fa-a7e9-89cb12c65928 nodeName:}" failed. No retries permitted until 2025-11-27 11:23:11.729965945 +0000 UTC m=+832.829464143 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs") pod "openstack-operator-controller-manager-6456fcdb48-tjnrt" (UID: "649aedb9-ad77-47fa-a7e9-89cb12c65928") : secret "metrics-server-cert" not found Nov 27 11:23:11 crc kubenswrapper[4807]: I1127 11:23:11.286852 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert\") pod \"infra-operator-controller-manager-57548d458d-qg4bq\" (UID: \"4be44e13-06b8-494e-8a62-7e8d8747692f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:11 crc kubenswrapper[4807]: E1127 11:23:11.287003 4807 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 27 11:23:11 crc kubenswrapper[4807]: E1127 11:23:11.287608 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert podName:4be44e13-06b8-494e-8a62-7e8d8747692f nodeName:}" failed. No retries permitted until 2025-11-27 11:23:19.287592227 +0000 UTC m=+840.387090425 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert") pod "infra-operator-controller-manager-57548d458d-qg4bq" (UID: "4be44e13-06b8-494e-8a62-7e8d8747692f") : secret "infra-operator-webhook-server-cert" not found Nov 27 11:23:11 crc kubenswrapper[4807]: I1127 11:23:11.490073 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2\" (UID: \"0bde1253-53c0-4864-b22e-dcf25751388e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:11 crc kubenswrapper[4807]: E1127 11:23:11.490321 4807 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 11:23:11 crc kubenswrapper[4807]: E1127 11:23:11.490403 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert podName:0bde1253-53c0-4864-b22e-dcf25751388e nodeName:}" failed. No retries permitted until 2025-11-27 11:23:19.490383148 +0000 UTC m=+840.589881346 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert") pod "openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" (UID: "0bde1253-53c0-4864-b22e-dcf25751388e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 27 11:23:11 crc kubenswrapper[4807]: I1127 11:23:11.795690 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:11 crc kubenswrapper[4807]: I1127 11:23:11.795808 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:11 crc kubenswrapper[4807]: E1127 11:23:11.795942 4807 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 27 11:23:11 crc kubenswrapper[4807]: E1127 11:23:11.795948 4807 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Nov 27 11:23:11 crc kubenswrapper[4807]: E1127 11:23:11.795999 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs podName:649aedb9-ad77-47fa-a7e9-89cb12c65928 nodeName:}" failed. No retries permitted until 2025-11-27 11:23:19.795984982 +0000 UTC m=+840.895483180 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs") pod "openstack-operator-controller-manager-6456fcdb48-tjnrt" (UID: "649aedb9-ad77-47fa-a7e9-89cb12c65928") : secret "webhook-server-cert" not found Nov 27 11:23:11 crc kubenswrapper[4807]: E1127 11:23:11.796282 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs podName:649aedb9-ad77-47fa-a7e9-89cb12c65928 nodeName:}" failed. No retries permitted until 2025-11-27 11:23:19.796236499 +0000 UTC m=+840.895734697 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs") pod "openstack-operator-controller-manager-6456fcdb48-tjnrt" (UID: "649aedb9-ad77-47fa-a7e9-89cb12c65928") : secret "metrics-server-cert" not found Nov 27 11:23:17 crc kubenswrapper[4807]: E1127 11:23:17.620025 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k96gk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-955677c94-5wfgl_openstack-operators(574b2edd-5058-4d84-a8b8-72258c3c9f7b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:17 crc kubenswrapper[4807]: E1127 11:23:17.621669 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-65hwt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5b77f656f-j2tq6_openstack-operators(c5b9cfda-ea17-4add-a121-036a989efeab): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:17 crc kubenswrapper[4807]: E1127 11:23:17.621732 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" podUID="574b2edd-5058-4d84-a8b8-72258c3c9f7b" Nov 27 11:23:17 crc kubenswrapper[4807]: E1127 11:23:17.623328 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" podUID="c5b9cfda-ea17-4add-a121-036a989efeab" Nov 27 11:23:17 crc kubenswrapper[4807]: E1127 11:23:17.651921 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7xgp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-589cbd6b5b-v9d6j_openstack-operators(5ae030e1-b973-4137-abd1-1abc5f5d1153): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:17 crc kubenswrapper[4807]: E1127 11:23:17.657342 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" podUID="5ae030e1-b973-4137-abd1-1abc5f5d1153" Nov 27 11:23:17 crc kubenswrapper[4807]: E1127 11:23:17.660338 4807 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lnzw9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-66f4dd4bc7-5wb7l_openstack-operators(fe4ff55b-a2dd-4936-9016-d73ade2388a0): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 27 11:23:17 crc kubenswrapper[4807]: E1127 11:23:17.662017 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" podUID="fe4ff55b-a2dd-4936-9016-d73ade2388a0" Nov 
27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.145699 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw" event={"ID":"623644bf-2d87-4689-acea-cfaeca90285f","Type":"ContainerStarted","Data":"70ce35b298372d33630c2660a1ef903b00d74a2a90161358a03ef79cd073fd5a"} Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.177517 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc" event={"ID":"3ae6d3a5-8999-4c3d-a3de-b497ae0776f2","Type":"ContainerStarted","Data":"1b0b0a35b87fd5a09bc4688944a9c84070270b28e06e9ba18204f7a09e7737df"} Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.217104 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s" event={"ID":"7377040f-fbf5-4395-a903-99dbb10dbcac","Type":"ContainerStarted","Data":"334f9fdbaff29410a5b2a99933902bc540ebd9cffc7472d4a91c68e4c9480e86"} Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.248128 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6" event={"ID":"4ae17b3e-8de9-45e3-8404-2f2fda6c6b99","Type":"ContainerStarted","Data":"c508c071f2437c6b776cb517684203510435accdb977abbeb80f43013c18e9db"} Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.268465 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" event={"ID":"574b2edd-5058-4d84-a8b8-72258c3c9f7b","Type":"ContainerStarted","Data":"202bf37eb5a69326b74316e31f6cb4f077e89994c6ba4ad7e2ecb5728c20706f"} Nov 27 11:23:18 crc kubenswrapper[4807]: E1127 11:23:18.272134 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" podUID="574b2edd-5058-4d84-a8b8-72258c3c9f7b" Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.275740 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls" event={"ID":"961001c9-3719-4306-8d38-b3c5d8e202bc","Type":"ContainerStarted","Data":"56f1f69d1ef55f562aafe97c2f32385c7b452188c1e331d94a7efcdf6e7e3037"} Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.281447 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm" event={"ID":"dcfd531a-2394-41c7-b05a-5b8e95f8459c","Type":"ContainerStarted","Data":"cf29723f79759e815f020e604d15f88b7b8b4d9b0fceca7330cf43d57ca69178"} Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.301111 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg" event={"ID":"af2f67ab-040b-4ec1-bf21-db83dcaeb6d2","Type":"ContainerStarted","Data":"b53c3ab965b1a298aa9305be43006f53c99b024cae23e9a32714488f927f248c"} Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.306808 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs" event={"ID":"9262ad56-c1b8-41ee-ab6b-1b3c57dbdb5b","Type":"ContainerStarted","Data":"0fe66e9d0a2bb05140881f076c0e1754e031e0466de71b8c39764009d7841003"} Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.326293 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" event={"ID":"fe4ff55b-a2dd-4936-9016-d73ade2388a0","Type":"ContainerStarted","Data":"a983716e2f66dcdd3a21b0bd579d7d34f04927b30a3ceec22aef2df5c2144a07"} Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.326476 
4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" Nov 27 11:23:18 crc kubenswrapper[4807]: E1127 11:23:18.329807 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" podUID="fe4ff55b-a2dd-4936-9016-d73ade2388a0" Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.332473 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw" event={"ID":"7fbca001-90e9-4da2-bd14-6bc00a48ed40","Type":"ContainerStarted","Data":"51be912eb919236d672af5cb00c63c42c233b1abaef5244cf19af02dc21b0ba6"} Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.341125 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8" event={"ID":"787c342b-413f-495a-8b31-bd8a01f35c3a","Type":"ContainerStarted","Data":"cf84313bc91c3e366248496f2a00cf525c953242eb749c291739f6b57f23a1d2"} Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.350399 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" event={"ID":"5ae030e1-b973-4137-abd1-1abc5f5d1153","Type":"ContainerStarted","Data":"a564a3bc1ce8969ea31895e59936997f914f7107fd08a21f9ea41166aa42cf05"} Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.350886 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" Nov 27 11:23:18 crc kubenswrapper[4807]: E1127 11:23:18.353419 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" podUID="5ae030e1-b973-4137-abd1-1abc5f5d1153" Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.358199 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" event={"ID":"c5b9cfda-ea17-4add-a121-036a989efeab","Type":"ContainerStarted","Data":"aae109db03f53f98b9067284073a8153b86ac310899f09d719a37fb64ef0242d"} Nov 27 11:23:18 crc kubenswrapper[4807]: I1127 11:23:18.358447 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" Nov 27 11:23:18 crc kubenswrapper[4807]: E1127 11:23:18.360423 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" podUID="c5b9cfda-ea17-4add-a121-036a989efeab" Nov 27 11:23:19 crc kubenswrapper[4807]: I1127 11:23:19.307868 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert\") pod \"infra-operator-controller-manager-57548d458d-qg4bq\" (UID: \"4be44e13-06b8-494e-8a62-7e8d8747692f\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:19 crc kubenswrapper[4807]: I1127 11:23:19.315845 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4be44e13-06b8-494e-8a62-7e8d8747692f-cert\") pod \"infra-operator-controller-manager-57548d458d-qg4bq\" (UID: \"4be44e13-06b8-494e-8a62-7e8d8747692f\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:19 crc kubenswrapper[4807]: I1127 11:23:19.364902 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" Nov 27 11:23:19 crc kubenswrapper[4807]: E1127 11:23:19.366789 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" podUID="fe4ff55b-a2dd-4936-9016-d73ade2388a0" Nov 27 11:23:19 crc kubenswrapper[4807]: E1127 11:23:19.366956 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" podUID="5ae030e1-b973-4137-abd1-1abc5f5d1153" Nov 27 11:23:19 crc kubenswrapper[4807]: E1127 11:23:19.368220 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" podUID="574b2edd-5058-4d84-a8b8-72258c3c9f7b" Nov 27 11:23:19 crc kubenswrapper[4807]: E1127 11:23:19.368281 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" podUID="c5b9cfda-ea17-4add-a121-036a989efeab" Nov 27 11:23:19 crc kubenswrapper[4807]: I1127 
11:23:19.510438 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2\" (UID: \"0bde1253-53c0-4864-b22e-dcf25751388e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:19 crc kubenswrapper[4807]: I1127 11:23:19.519090 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0bde1253-53c0-4864-b22e-dcf25751388e-cert\") pod \"openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2\" (UID: \"0bde1253-53c0-4864-b22e-dcf25751388e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:19 crc kubenswrapper[4807]: I1127 11:23:19.573672 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9s7zc" Nov 27 11:23:19 crc kubenswrapper[4807]: I1127 11:23:19.573868 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:19 crc kubenswrapper[4807]: I1127 11:23:19.602007 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-tqrrt" Nov 27 11:23:19 crc kubenswrapper[4807]: I1127 11:23:19.610023 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:19 crc kubenswrapper[4807]: I1127 11:23:19.814350 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:19 crc kubenswrapper[4807]: I1127 11:23:19.814508 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:19 crc kubenswrapper[4807]: I1127 11:23:19.819157 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-metrics-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:19 crc kubenswrapper[4807]: I1127 11:23:19.838979 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/649aedb9-ad77-47fa-a7e9-89cb12c65928-webhook-certs\") pod \"openstack-operator-controller-manager-6456fcdb48-tjnrt\" (UID: \"649aedb9-ad77-47fa-a7e9-89cb12c65928\") " pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:20 crc kubenswrapper[4807]: I1127 11:23:20.140928 4807 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-f8df9" Nov 27 11:23:20 crc kubenswrapper[4807]: I1127 11:23:20.148286 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:20 crc kubenswrapper[4807]: I1127 11:23:20.296932 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq"] Nov 27 11:23:20 crc kubenswrapper[4807]: E1127 11:23:20.372557 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" podUID="574b2edd-5058-4d84-a8b8-72258c3c9f7b" Nov 27 11:23:20 crc kubenswrapper[4807]: I1127 11:23:20.451000 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2"] Nov 27 11:23:20 crc kubenswrapper[4807]: I1127 11:23:20.921423 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:23:20 crc kubenswrapper[4807]: I1127 11:23:20.921492 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:23:20 crc kubenswrapper[4807]: I1127 11:23:20.921542 4807 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:23:20 crc kubenswrapper[4807]: I1127 11:23:20.922200 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5c16a271e2f512c2b0b496bddb1e050219d71c041d2a908668448dc5280aeab0"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 11:23:20 crc kubenswrapper[4807]: I1127 11:23:20.922296 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://5c16a271e2f512c2b0b496bddb1e050219d71c041d2a908668448dc5280aeab0" gracePeriod=600 Nov 27 11:23:20 crc kubenswrapper[4807]: W1127 11:23:20.943578 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bde1253_53c0_4864_b22e_dcf25751388e.slice/crio-ca55d9a191883c133cb0a11a74d4143aad29d0536474c0f60cc5fc3af2ec427f WatchSource:0}: Error finding container ca55d9a191883c133cb0a11a74d4143aad29d0536474c0f60cc5fc3af2ec427f: Status 404 returned error can't find the container with id ca55d9a191883c133cb0a11a74d4143aad29d0536474c0f60cc5fc3af2ec427f Nov 27 11:23:21 crc kubenswrapper[4807]: I1127 11:23:21.382540 4807 generic.go:334] "Generic (PLEG): container finished" podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerID="5c16a271e2f512c2b0b496bddb1e050219d71c041d2a908668448dc5280aeab0" exitCode=0 Nov 27 11:23:21 crc kubenswrapper[4807]: I1127 11:23:21.382618 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" 
event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"5c16a271e2f512c2b0b496bddb1e050219d71c041d2a908668448dc5280aeab0"} Nov 27 11:23:21 crc kubenswrapper[4807]: I1127 11:23:21.382970 4807 scope.go:117] "RemoveContainer" containerID="950bd2e48636b06df84e0002296e518c85524a06e8b0c7352cd93856ad7f71ef" Nov 27 11:23:21 crc kubenswrapper[4807]: I1127 11:23:21.385134 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" event={"ID":"0bde1253-53c0-4864-b22e-dcf25751388e","Type":"ContainerStarted","Data":"ca55d9a191883c133cb0a11a74d4143aad29d0536474c0f60cc5fc3af2ec427f"} Nov 27 11:23:21 crc kubenswrapper[4807]: I1127 11:23:21.387216 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" event={"ID":"4be44e13-06b8-494e-8a62-7e8d8747692f","Type":"ContainerStarted","Data":"8024d6812972a38cbc2b17bc0e6a5c849eaf6c13291f866c4ccba135db544bd4"} Nov 27 11:23:23 crc kubenswrapper[4807]: I1127 11:23:23.541581 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" Nov 27 11:23:23 crc kubenswrapper[4807]: E1127 11:23:23.544687 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" podUID="574b2edd-5058-4d84-a8b8-72258c3c9f7b" Nov 27 11:23:23 crc kubenswrapper[4807]: I1127 11:23:23.552312 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" Nov 27 11:23:23 crc kubenswrapper[4807]: E1127 11:23:23.555693 4807 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" podUID="5ae030e1-b973-4137-abd1-1abc5f5d1153" Nov 27 11:23:23 crc kubenswrapper[4807]: I1127 11:23:23.584854 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" Nov 27 11:23:23 crc kubenswrapper[4807]: E1127 11:23:23.586878 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" podUID="c5b9cfda-ea17-4add-a121-036a989efeab" Nov 27 11:23:23 crc kubenswrapper[4807]: I1127 11:23:23.715190 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" Nov 27 11:23:23 crc kubenswrapper[4807]: E1127 11:23:23.717032 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" podUID="fe4ff55b-a2dd-4936-9016-d73ade2388a0" Nov 27 11:23:25 crc kubenswrapper[4807]: I1127 11:23:25.309483 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt"] Nov 27 11:23:39 crc kubenswrapper[4807]: E1127 11:23:39.416926 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4" Nov 27 11:23:39 crc kubenswrapper[4807]: E1127 11:23:39.417900 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rg6wc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-d77b94747-pd7pj_openstack-operators(94062b2f-3f5a-404d-9b0a-8b7f858e1322): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 11:23:40 crc kubenswrapper[4807]: E1127 11:23:40.013818 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385" Nov 27 11:23:40 crc kubenswrapper[4807]: E1127 11:23:40.015474 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lr2vp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-svq2j_openstack-operators(5b554316-8e33-4fa8-a340-91d9e0f6b0de): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 11:23:40 crc kubenswrapper[4807]: E1127 11:23:40.031588 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/systemd-tmpfiles-clean.service\": RecentStats: unable to find data in memory cache]" Nov 27 11:23:40 crc kubenswrapper[4807]: I1127 11:23:40.566573 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc" event={"ID":"8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91","Type":"ContainerStarted","Data":"8a37f083fe76ac9b911a2cb200a70e1196dcf5dde7c75df800393a82c54aa2b9"} Nov 27 11:23:40 crc kubenswrapper[4807]: I1127 11:23:40.577701 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" 
event={"ID":"649aedb9-ad77-47fa-a7e9-89cb12c65928","Type":"ContainerStarted","Data":"91b0332612cd28e7b1d5e2c530503304dab44c011a6e41da507dc8c0d79aa669"} Nov 27 11:23:40 crc kubenswrapper[4807]: I1127 11:23:40.577793 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" event={"ID":"649aedb9-ad77-47fa-a7e9-89cb12c65928","Type":"ContainerStarted","Data":"877592b4c329f7fcc8cbab4010854e9ce49c92eea10664c1da6846c7e6a8c45b"} Nov 27 11:23:40 crc kubenswrapper[4807]: I1127 11:23:40.577838 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:40 crc kubenswrapper[4807]: I1127 11:23:40.582785 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" event={"ID":"13749acc-f727-4c3a-b24a-b56bd6b7533d","Type":"ContainerStarted","Data":"422e884aeb7649965061b9d50b67cebb2ca2edcf96e1e9b20f5fbfa9b46ca842"} Nov 27 11:23:40 crc kubenswrapper[4807]: I1127 11:23:40.585615 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-l72vc" podStartSLOduration=2.484386364 podStartE2EDuration="37.585598716s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:05.012654363 +0000 UTC m=+826.112152561" lastFinishedPulling="2025-11-27 11:23:40.113866715 +0000 UTC m=+861.213364913" observedRunningTime="2025-11-27 11:23:40.580915242 +0000 UTC m=+861.680413440" watchObservedRunningTime="2025-11-27 11:23:40.585598716 +0000 UTC m=+861.685096914" Nov 27 11:23:40 crc kubenswrapper[4807]: I1127 11:23:40.586750 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" 
event={"ID":"1b313bc2-c896-486c-a520-9843ec7bd6ad","Type":"ContainerStarted","Data":"2bf3db061cdc4d3112ddd603959f3338e6ff28b875704132cf600d8bd47adf15"} Nov 27 11:23:40 crc kubenswrapper[4807]: I1127 11:23:40.606343 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" podStartSLOduration=37.606328863 podStartE2EDuration="37.606328863s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:23:40.604151786 +0000 UTC m=+861.703649994" watchObservedRunningTime="2025-11-27 11:23:40.606328863 +0000 UTC m=+861.705827061" Nov 27 11:23:40 crc kubenswrapper[4807]: I1127 11:23:40.607140 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"f949ac50efe1cb33ac8f9f8fad96e486a8238fcb507e2fef2a39dd8e43ee4952"} Nov 27 11:23:40 crc kubenswrapper[4807]: I1127 11:23:40.612497 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" event={"ID":"a0e8d2a3-0f58-4a1d-9867-648001196d2e","Type":"ContainerStarted","Data":"b22fb6c63829b031e537941ad9e10dc128f319813fcba67a9dc0ee867175e301"} Nov 27 11:23:41 crc kubenswrapper[4807]: E1127 11:23:41.055482 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" podUID="5b554316-8e33-4fa8-a340-91d9e0f6b0de" Nov 27 11:23:41 crc kubenswrapper[4807]: E1127 11:23:41.496972 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" podUID="94062b2f-3f5a-404d-9b0a-8b7f858e1322" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.639735 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc" event={"ID":"3ae6d3a5-8999-4c3d-a3de-b497ae0776f2","Type":"ContainerStarted","Data":"00e97cf0740f9bf97cfead1cb6bf0fbeebee264718d6ba523282c3c0bd1652b9"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.640820 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.642609 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.657742 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" event={"ID":"c5b9cfda-ea17-4add-a121-036a989efeab","Type":"ContainerStarted","Data":"7c1e44bb2a7f72e53c1835e1401401ce979e041e31bee90990533328bd5cad24"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.668389 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" event={"ID":"4be44e13-06b8-494e-8a62-7e8d8747692f","Type":"ContainerStarted","Data":"4099cd89e91a876cc35616eaea50ece8fb0b76fb8aa68bddf6a7e2256536dca0"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.668661 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.679639 4807 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5d499bf58b-hcnzc" podStartSLOduration=3.152673153 podStartE2EDuration="38.679623183s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.58783412 +0000 UTC m=+825.687332318" lastFinishedPulling="2025-11-27 11:23:40.11478415 +0000 UTC m=+861.214282348" observedRunningTime="2025-11-27 11:23:41.677149728 +0000 UTC m=+862.776647946" watchObservedRunningTime="2025-11-27 11:23:41.679623183 +0000 UTC m=+862.779121381" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.681290 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" event={"ID":"fe4ff55b-a2dd-4936-9016-d73ade2388a0","Type":"ContainerStarted","Data":"74b91a1f2740a9a94327244c62a6e8c856e14ccf76aebe04bf6c5a789fcac7ce"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.686350 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6" event={"ID":"4ae17b3e-8de9-45e3-8404-2f2fda6c6b99","Type":"ContainerStarted","Data":"979cc39404d47d3a53028e4e8428bf851536e9094b93f645ccd017f9ee5d5b48"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.688304 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.688720 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.691685 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" 
event={"ID":"94062b2f-3f5a-404d-9b0a-8b7f858e1322","Type":"ContainerStarted","Data":"48844391a639b5a43a1ac63d28c836fd8e791fe87aeea584ee36420d525bfc28"} Nov 27 11:23:41 crc kubenswrapper[4807]: E1127 11:23:41.707578 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" podUID="94062b2f-3f5a-404d-9b0a-8b7f858e1322" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.708328 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5b77f656f-j2tq6" podStartSLOduration=26.058846098 podStartE2EDuration="38.708305701s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.403010808 +0000 UTC m=+825.502509006" lastFinishedPulling="2025-11-27 11:23:17.052470411 +0000 UTC m=+838.151968609" observedRunningTime="2025-11-27 11:23:41.707871659 +0000 UTC m=+862.807369857" watchObservedRunningTime="2025-11-27 11:23:41.708305701 +0000 UTC m=+862.807803889" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.709319 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" event={"ID":"5ae030e1-b973-4137-abd1-1abc5f5d1153","Type":"ContainerStarted","Data":"36c699469d2ec08f4adc3c9cdb53437916c99da565bc19e3d4d84ff3fcb35a2a"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.713173 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" event={"ID":"a0e8d2a3-0f58-4a1d-9867-648001196d2e","Type":"ContainerStarted","Data":"1744204e6f1d35a56847a1bf9c83761d2e1d4a64920ad2971cdf734424d952fe"} Nov 27 11:23:41 crc 
kubenswrapper[4807]: I1127 11:23:41.713882 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.715934 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" event={"ID":"13749acc-f727-4c3a-b24a-b56bd6b7533d","Type":"ContainerStarted","Data":"fbaace0d504f3a229c04463667e93680258d0c6461f7b70952a10c32cbf82fa4"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.716151 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.718113 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls" event={"ID":"961001c9-3719-4306-8d38-b3c5d8e202bc","Type":"ContainerStarted","Data":"b7721799b57fcdf51fcdeff062e134a557c231a71ac9b945d4369dcc2958f3d7"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.718348 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.720090 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm" event={"ID":"dcfd531a-2394-41c7-b05a-5b8e95f8459c","Type":"ContainerStarted","Data":"9da8cb2796cdb3acd82ff44a6f2fb541ff387c91e6343be2d6f4b5ac30121084"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.723398 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.726521 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.737522 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.750145 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" podStartSLOduration=19.571355155 podStartE2EDuration="38.750107785s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:20.935156896 +0000 UTC m=+842.034655094" lastFinishedPulling="2025-11-27 11:23:40.113909526 +0000 UTC m=+861.213407724" observedRunningTime="2025-11-27 11:23:41.740885191 +0000 UTC m=+862.840383399" watchObservedRunningTime="2025-11-27 11:23:41.750107785 +0000 UTC m=+862.849605983" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.759271 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" event={"ID":"0bde1253-53c0-4864-b22e-dcf25751388e","Type":"ContainerStarted","Data":"9a4d4bf8ddd328b4c5762c568aeddd708da6b18e0797e0b326f5c85ec8917ee0"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.759315 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" event={"ID":"0bde1253-53c0-4864-b22e-dcf25751388e","Type":"ContainerStarted","Data":"f8406e5af253cd8c7ab50b29368b7e241115a5413b3bc1d4e70f0dde6c9613b8"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.761305 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.793300 4807 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg" event={"ID":"af2f67ab-040b-4ec1-bf21-db83dcaeb6d2","Type":"ContainerStarted","Data":"c53797cfc7a0204ba0b42b6138f47356b867a061b32a888e14dc95db9cdfceef"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.794676 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.800568 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.805654 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs" event={"ID":"9262ad56-c1b8-41ee-ab6b-1b3c57dbdb5b","Type":"ContainerStarted","Data":"ec13addb59b7f9b429ecda6ad9baae667f24a7e50b7c385dede90f5a4518ec13"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.806399 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.810700 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-66f4dd4bc7-5wb7l" podStartSLOduration=26.332723929 podStartE2EDuration="38.810681585s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.57471522 +0000 UTC m=+825.674213418" lastFinishedPulling="2025-11-27 11:23:17.052672876 +0000 UTC m=+838.152171074" observedRunningTime="2025-11-27 11:23:41.807132471 +0000 UTC m=+862.906630669" watchObservedRunningTime="2025-11-27 11:23:41.810681585 +0000 UTC m=+862.910179783" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.814932 4807 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s" event={"ID":"7377040f-fbf5-4395-a903-99dbb10dbcac","Type":"ContainerStarted","Data":"492cc585e2513dac64225f062a9258d29657f08c43854100b019122aec817bf8"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.816065 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.817467 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.838522 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw" event={"ID":"7fbca001-90e9-4da2-bd14-6bc00a48ed40","Type":"ContainerStarted","Data":"32747ac42a0a40bf2925fac0131fb439c65b52ac35087f98679fc5bebb962589"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.839447 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.839611 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.852662 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.853236 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" podStartSLOduration=18.493297561 podStartE2EDuration="38.853216088s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" 
firstStartedPulling="2025-11-27 11:23:04.867478981 +0000 UTC m=+825.966977179" lastFinishedPulling="2025-11-27 11:23:25.227397508 +0000 UTC m=+846.326895706" observedRunningTime="2025-11-27 11:23:41.838676134 +0000 UTC m=+862.938174322" watchObservedRunningTime="2025-11-27 11:23:41.853216088 +0000 UTC m=+862.952714286" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.876029 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" event={"ID":"574b2edd-5058-4d84-a8b8-72258c3c9f7b","Type":"ContainerStarted","Data":"b631f618f1906cd512b4047390a5e9013c429021eae1e727828878b70f0f03c9"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.892388 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6fdcddb789-xfrls" podStartSLOduration=3.491939539 podStartE2EDuration="38.892367313s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.714022927 +0000 UTC m=+825.813521125" lastFinishedPulling="2025-11-27 11:23:40.114450711 +0000 UTC m=+861.213948899" observedRunningTime="2025-11-27 11:23:41.866543471 +0000 UTC m=+862.966041669" watchObservedRunningTime="2025-11-27 11:23:41.892367313 +0000 UTC m=+862.991865511" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.915887 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" event={"ID":"1b313bc2-c896-486c-a520-9843ec7bd6ad","Type":"ContainerStarted","Data":"f91d64e6066661188f5982a445269ba570acb27116e4ae4004d22b9b17ad32a6"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.915927 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.919607 4807 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" event={"ID":"5b554316-8e33-4fa8-a340-91d9e0f6b0de","Type":"ContainerStarted","Data":"9d8316d2842567b6bfaced1f7c7b7d32057cca662de4f6c086775c219196089d"} Nov 27 11:23:41 crc kubenswrapper[4807]: E1127 11:23:41.921020 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" podUID="5b554316-8e33-4fa8-a340-91d9e0f6b0de" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.970351 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw" event={"ID":"623644bf-2d87-4689-acea-cfaeca90285f","Type":"ContainerStarted","Data":"9da9be3e42f5eb53c917fbff9065bcc4cc1c8863c9c722787a77f895edda8d13"} Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.971438 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.979051 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8" Nov 27 11:23:41 crc kubenswrapper[4807]: I1127 11:23:41.987028 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:41.993846 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.042731 4807 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" podStartSLOduration=19.832159126 podStartE2EDuration="39.042712094s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:20.947005262 +0000 UTC m=+842.046503470" lastFinishedPulling="2025-11-27 11:23:40.15755823 +0000 UTC m=+861.257056438" observedRunningTime="2025-11-27 11:23:41.970024034 +0000 UTC m=+863.069522252" watchObservedRunningTime="2025-11-27 11:23:42.042712094 +0000 UTC m=+863.142210292" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.043157 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" podStartSLOduration=18.810846661 podStartE2EDuration="39.043152476s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.872599338 +0000 UTC m=+825.972097536" lastFinishedPulling="2025-11-27 11:23:25.104905153 +0000 UTC m=+846.204403351" observedRunningTime="2025-11-27 11:23:41.995337302 +0000 UTC m=+863.094835510" watchObservedRunningTime="2025-11-27 11:23:42.043152476 +0000 UTC m=+863.142650674" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.061034 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-589cbd6b5b-v9d6j" podStartSLOduration=26.413907017 podStartE2EDuration="39.061015507s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.40270398 +0000 UTC m=+825.502202178" lastFinishedPulling="2025-11-27 11:23:17.04981247 +0000 UTC m=+838.149310668" observedRunningTime="2025-11-27 11:23:42.044116191 +0000 UTC m=+863.143614389" watchObservedRunningTime="2025-11-27 11:23:42.061015507 +0000 UTC m=+863.160513705" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.096564 4807 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-79556f57fc-6l4tm" podStartSLOduration=3.820472913 podStartE2EDuration="39.096548696s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.736404074 +0000 UTC m=+825.835902272" lastFinishedPulling="2025-11-27 11:23:40.012479857 +0000 UTC m=+861.111978055" observedRunningTime="2025-11-27 11:23:42.093123265 +0000 UTC m=+863.192621463" watchObservedRunningTime="2025-11-27 11:23:42.096548696 +0000 UTC m=+863.196046894" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.100035 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5d494799bf-fztl6" podStartSLOduration=3.435535778 podStartE2EDuration="39.100027408s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.448387049 +0000 UTC m=+825.547885247" lastFinishedPulling="2025-11-27 11:23:40.112878679 +0000 UTC m=+861.212376877" observedRunningTime="2025-11-27 11:23:42.075590332 +0000 UTC m=+863.175088540" watchObservedRunningTime="2025-11-27 11:23:42.100027408 +0000 UTC m=+863.199525606" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.125318 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5cd6c7f4c8-mw4mw" podStartSLOduration=4.023026935 podStartE2EDuration="39.125293985s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:05.011518263 +0000 UTC m=+826.111016461" lastFinishedPulling="2025-11-27 11:23:40.113785313 +0000 UTC m=+861.213283511" observedRunningTime="2025-11-27 11:23:42.115723582 +0000 UTC m=+863.215221780" watchObservedRunningTime="2025-11-27 11:23:42.125293985 +0000 UTC m=+863.224792183" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.148768 4807 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" podStartSLOduration=18.761161399 podStartE2EDuration="39.148751465s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.765206402 +0000 UTC m=+825.864704600" lastFinishedPulling="2025-11-27 11:23:25.152796478 +0000 UTC m=+846.252294666" observedRunningTime="2025-11-27 11:23:42.147640106 +0000 UTC m=+863.247138294" watchObservedRunningTime="2025-11-27 11:23:42.148751465 +0000 UTC m=+863.248249663" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.170695 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-955677c94-5wfgl" podStartSLOduration=26.486436414 podStartE2EDuration="39.170676084s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.402953077 +0000 UTC m=+825.502451275" lastFinishedPulling="2025-11-27 11:23:17.087192747 +0000 UTC m=+838.186690945" observedRunningTime="2025-11-27 11:23:42.165682412 +0000 UTC m=+863.265180610" watchObservedRunningTime="2025-11-27 11:23:42.170676084 +0000 UTC m=+863.270174282" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.239910 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6b7f75547b-6sqtw" podStartSLOduration=3.311194603 podStartE2EDuration="39.239892472s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.185656929 +0000 UTC m=+825.285155127" lastFinishedPulling="2025-11-27 11:23:40.114354798 +0000 UTC m=+861.213852996" observedRunningTime="2025-11-27 11:23:42.21256205 +0000 UTC m=+863.312060248" watchObservedRunningTime="2025-11-27 11:23:42.239892472 +0000 UTC m=+863.339390670" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.274364 4807 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8" podStartSLOduration=3.920981264 podStartE2EDuration="39.274347573s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.760527587 +0000 UTC m=+825.860025805" lastFinishedPulling="2025-11-27 11:23:40.113893896 +0000 UTC m=+861.213392114" observedRunningTime="2025-11-27 11:23:42.239454071 +0000 UTC m=+863.338952269" watchObservedRunningTime="2025-11-27 11:23:42.274347573 +0000 UTC m=+863.373845771" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.274493 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7b4567c7cf-lz6lg" podStartSLOduration=3.74715154 podStartE2EDuration="39.274489316s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.566445409 +0000 UTC m=+825.665943607" lastFinishedPulling="2025-11-27 11:23:40.093783185 +0000 UTC m=+861.193281383" observedRunningTime="2025-11-27 11:23:42.271564769 +0000 UTC m=+863.371062967" watchObservedRunningTime="2025-11-27 11:23:42.274489316 +0000 UTC m=+863.373987514" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.320533 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-67cb4dc6d4-xtdbs" podStartSLOduration=3.971481159 podStartE2EDuration="39.320507242s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.763877007 +0000 UTC m=+825.863375205" lastFinishedPulling="2025-11-27 11:23:40.11290309 +0000 UTC m=+861.212401288" observedRunningTime="2025-11-27 11:23:42.30149359 +0000 UTC m=+863.400991798" watchObservedRunningTime="2025-11-27 11:23:42.320507242 +0000 UTC m=+863.420005440" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.329304 4807 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b64f4fb85-z6w5s" podStartSLOduration=3.318634038 podStartE2EDuration="39.329281374s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.103159788 +0000 UTC m=+825.202657986" lastFinishedPulling="2025-11-27 11:23:40.113807124 +0000 UTC m=+861.213305322" observedRunningTime="2025-11-27 11:23:42.321297443 +0000 UTC m=+863.420795651" watchObservedRunningTime="2025-11-27 11:23:42.329281374 +0000 UTC m=+863.428779572" Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.986404 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-56897c768d-d9mj8" event={"ID":"787c342b-413f-495a-8b31-bd8a01f35c3a","Type":"ContainerStarted","Data":"e61b5c3b98f28b0a445b1def52a2d877a3bbbd7572f7b2dbc780d8e8e7ae8312"} Nov 27 11:23:42 crc kubenswrapper[4807]: I1127 11:23:42.987854 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" event={"ID":"4be44e13-06b8-494e-8a62-7e8d8747692f","Type":"ContainerStarted","Data":"382ac84cbbc5de42f72478815c9b12be696f8138d9673891d93506855bdd5560"} Nov 27 11:23:49 crc kubenswrapper[4807]: I1127 11:23:49.581719 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-qg4bq" Nov 27 11:23:49 crc kubenswrapper[4807]: I1127 11:23:49.625166 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2" Nov 27 11:23:50 crc kubenswrapper[4807]: I1127 11:23:50.153390 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6456fcdb48-tjnrt" Nov 27 11:23:52 crc kubenswrapper[4807]: E1127 11:23:52.534550 4807 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72236301580ff9080f7e311b832d7ba66666a9afeda51f969745229624ff26e4\\\"\"" pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" podUID="94062b2f-3f5a-404d-9b0a-8b7f858e1322" Nov 27 11:23:53 crc kubenswrapper[4807]: I1127 11:23:53.932332 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-64cdc6ff96-k82xf" Nov 27 11:23:54 crc kubenswrapper[4807]: I1127 11:23:54.048363 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-57988cc5b5-cw9bt" Nov 27 11:23:54 crc kubenswrapper[4807]: I1127 11:23:54.224123 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-656dcb59d4-tj4db" Nov 27 11:23:57 crc kubenswrapper[4807]: E1127 11:23:57.534607 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" podUID="5b554316-8e33-4fa8-a340-91d9e0f6b0de" Nov 27 11:24:09 crc kubenswrapper[4807]: I1127 11:24:09.163124 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" event={"ID":"94062b2f-3f5a-404d-9b0a-8b7f858e1322","Type":"ContainerStarted","Data":"f33faa3f26d7d48577ca717a9af21b8aa40b8947b6694540d401d40c390be6fd"} Nov 27 11:24:09 crc kubenswrapper[4807]: I1127 11:24:09.163891 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" Nov 27 11:24:09 crc kubenswrapper[4807]: I1127 11:24:09.177314 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" podStartSLOduration=3.018513361 podStartE2EDuration="1m6.177301496s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.852105611 +0000 UTC m=+825.951603829" lastFinishedPulling="2025-11-27 11:24:08.010893766 +0000 UTC m=+889.110391964" observedRunningTime="2025-11-27 11:24:09.176804203 +0000 UTC m=+890.276302391" watchObservedRunningTime="2025-11-27 11:24:09.177301496 +0000 UTC m=+890.276799704" Nov 27 11:24:11 crc kubenswrapper[4807]: I1127 11:24:11.534511 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 11:24:13 crc kubenswrapper[4807]: I1127 11:24:13.190394 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" event={"ID":"5b554316-8e33-4fa8-a340-91d9e0f6b0de","Type":"ContainerStarted","Data":"cbee06ab06df72875ab74750e8418a3db9874fc337fcbe33bc02ebf9e5304beb"} Nov 27 11:24:13 crc kubenswrapper[4807]: I1127 11:24:13.190916 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" Nov 27 11:24:13 crc kubenswrapper[4807]: I1127 11:24:13.232467 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" podStartSLOduration=2.961563439 podStartE2EDuration="1m10.23244557s" podCreationTimestamp="2025-11-27 11:23:03 +0000 UTC" firstStartedPulling="2025-11-27 11:23:04.863049533 +0000 UTC m=+825.962547751" lastFinishedPulling="2025-11-27 11:24:12.133931684 +0000 UTC m=+893.233429882" observedRunningTime="2025-11-27 
11:24:13.229128792 +0000 UTC m=+894.328626990" watchObservedRunningTime="2025-11-27 11:24:13.23244557 +0000 UTC m=+894.331943768" Nov 27 11:24:14 crc kubenswrapper[4807]: I1127 11:24:14.103718 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-d77b94747-pd7pj" Nov 27 11:24:24 crc kubenswrapper[4807]: I1127 11:24:24.130465 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-svq2j" Nov 27 11:24:30 crc kubenswrapper[4807]: I1127 11:24:30.956690 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p4xz7"] Nov 27 11:24:30 crc kubenswrapper[4807]: I1127 11:24:30.958656 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:30 crc kubenswrapper[4807]: I1127 11:24:30.969492 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p4xz7"] Nov 27 11:24:31 crc kubenswrapper[4807]: I1127 11:24:31.067870 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-utilities\") pod \"redhat-operators-p4xz7\" (UID: \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\") " pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:31 crc kubenswrapper[4807]: I1127 11:24:31.067967 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsvr5\" (UniqueName: \"kubernetes.io/projected/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-kube-api-access-rsvr5\") pod \"redhat-operators-p4xz7\" (UID: \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\") " pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:31 crc kubenswrapper[4807]: I1127 11:24:31.068056 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-catalog-content\") pod \"redhat-operators-p4xz7\" (UID: \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\") " pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:31 crc kubenswrapper[4807]: I1127 11:24:31.169527 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-utilities\") pod \"redhat-operators-p4xz7\" (UID: \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\") " pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:31 crc kubenswrapper[4807]: I1127 11:24:31.169605 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsvr5\" (UniqueName: \"kubernetes.io/projected/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-kube-api-access-rsvr5\") pod \"redhat-operators-p4xz7\" (UID: \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\") " pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:31 crc kubenswrapper[4807]: I1127 11:24:31.169661 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-catalog-content\") pod \"redhat-operators-p4xz7\" (UID: \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\") " pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:31 crc kubenswrapper[4807]: I1127 11:24:31.170179 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-catalog-content\") pod \"redhat-operators-p4xz7\" (UID: \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\") " pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:31 crc kubenswrapper[4807]: I1127 11:24:31.170355 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-utilities\") pod \"redhat-operators-p4xz7\" (UID: \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\") " pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:31 crc kubenswrapper[4807]: I1127 11:24:31.198346 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsvr5\" (UniqueName: \"kubernetes.io/projected/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-kube-api-access-rsvr5\") pod \"redhat-operators-p4xz7\" (UID: \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\") " pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:31 crc kubenswrapper[4807]: I1127 11:24:31.278447 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:31 crc kubenswrapper[4807]: I1127 11:24:31.758329 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p4xz7"] Nov 27 11:24:31 crc kubenswrapper[4807]: W1127 11:24:31.761206 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04cf0017_8ab1_4a1a_848e_9bbbdd3d36e7.slice/crio-a99f9b1390513c768f25ca2ddb7a96c817c47784f418ef927815d2f428c2117b WatchSource:0}: Error finding container a99f9b1390513c768f25ca2ddb7a96c817c47784f418ef927815d2f428c2117b: Status 404 returned error can't find the container with id a99f9b1390513c768f25ca2ddb7a96c817c47784f418ef927815d2f428c2117b Nov 27 11:24:32 crc kubenswrapper[4807]: I1127 11:24:32.315454 4807 generic.go:334] "Generic (PLEG): container finished" podID="04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" containerID="940b0267ddbbf2d33a7f2b391875a4978edaa658bcf0f1cfdc2c9852c90fb598" exitCode=0 Nov 27 11:24:32 crc kubenswrapper[4807]: I1127 11:24:32.315522 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4xz7" 
event={"ID":"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7","Type":"ContainerDied","Data":"940b0267ddbbf2d33a7f2b391875a4978edaa658bcf0f1cfdc2c9852c90fb598"} Nov 27 11:24:32 crc kubenswrapper[4807]: I1127 11:24:32.315566 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4xz7" event={"ID":"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7","Type":"ContainerStarted","Data":"a99f9b1390513c768f25ca2ddb7a96c817c47784f418ef927815d2f428c2117b"} Nov 27 11:24:34 crc kubenswrapper[4807]: I1127 11:24:34.329364 4807 generic.go:334] "Generic (PLEG): container finished" podID="04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" containerID="33021570d0f703402e59a2914746f48142ec5e9f8f7908b32ca4f7662b1b8467" exitCode=0 Nov 27 11:24:34 crc kubenswrapper[4807]: I1127 11:24:34.329441 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4xz7" event={"ID":"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7","Type":"ContainerDied","Data":"33021570d0f703402e59a2914746f48142ec5e9f8f7908b32ca4f7662b1b8467"} Nov 27 11:24:35 crc kubenswrapper[4807]: I1127 11:24:35.338663 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4xz7" event={"ID":"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7","Type":"ContainerStarted","Data":"e6b167b6ae552deb5b6d0e69e851470f040d9d1ebc806cfa5e35ab060bc57425"} Nov 27 11:24:35 crc kubenswrapper[4807]: I1127 11:24:35.363341 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p4xz7" podStartSLOduration=2.872751774 podStartE2EDuration="5.363317551s" podCreationTimestamp="2025-11-27 11:24:30 +0000 UTC" firstStartedPulling="2025-11-27 11:24:32.316706597 +0000 UTC m=+913.416204785" lastFinishedPulling="2025-11-27 11:24:34.807272364 +0000 UTC m=+915.906770562" observedRunningTime="2025-11-27 11:24:35.355470244 +0000 UTC m=+916.454968462" watchObservedRunningTime="2025-11-27 11:24:35.363317551 +0000 UTC m=+916.462815759" Nov 27 
11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.199657 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8p2mg"] Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.201016 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8p2mg" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.205774 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.206291 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.206559 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.205825 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-gffss" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.213997 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8p2mg"] Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.264328 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7pdn\" (UniqueName: \"kubernetes.io/projected/e52e0389-6f80-456a-8763-e813b6eab09c-kube-api-access-h7pdn\") pod \"dnsmasq-dns-675f4bcbfc-8p2mg\" (UID: \"e52e0389-6f80-456a-8763-e813b6eab09c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8p2mg" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.264415 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52e0389-6f80-456a-8763-e813b6eab09c-config\") pod \"dnsmasq-dns-675f4bcbfc-8p2mg\" (UID: \"e52e0389-6f80-456a-8763-e813b6eab09c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8p2mg" Nov 27 11:24:38 crc 
kubenswrapper[4807]: I1127 11:24:38.304381 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vq5wc"] Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.305791 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.309597 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.320725 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vq5wc"] Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.365753 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b90b03e-51e6-4862-acff-02a1b895a10e-config\") pod \"dnsmasq-dns-78dd6ddcc-vq5wc\" (UID: \"3b90b03e-51e6-4862-acff-02a1b895a10e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.365804 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7pdn\" (UniqueName: \"kubernetes.io/projected/e52e0389-6f80-456a-8763-e813b6eab09c-kube-api-access-h7pdn\") pod \"dnsmasq-dns-675f4bcbfc-8p2mg\" (UID: \"e52e0389-6f80-456a-8763-e813b6eab09c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8p2mg" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.365949 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b90b03e-51e6-4862-acff-02a1b895a10e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vq5wc\" (UID: \"3b90b03e-51e6-4862-acff-02a1b895a10e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.366136 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e52e0389-6f80-456a-8763-e813b6eab09c-config\") pod \"dnsmasq-dns-675f4bcbfc-8p2mg\" (UID: \"e52e0389-6f80-456a-8763-e813b6eab09c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8p2mg" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.366226 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc7c9\" (UniqueName: \"kubernetes.io/projected/3b90b03e-51e6-4862-acff-02a1b895a10e-kube-api-access-vc7c9\") pod \"dnsmasq-dns-78dd6ddcc-vq5wc\" (UID: \"3b90b03e-51e6-4862-acff-02a1b895a10e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.367269 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52e0389-6f80-456a-8763-e813b6eab09c-config\") pod \"dnsmasq-dns-675f4bcbfc-8p2mg\" (UID: \"e52e0389-6f80-456a-8763-e813b6eab09c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8p2mg" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.383230 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7pdn\" (UniqueName: \"kubernetes.io/projected/e52e0389-6f80-456a-8763-e813b6eab09c-kube-api-access-h7pdn\") pod \"dnsmasq-dns-675f4bcbfc-8p2mg\" (UID: \"e52e0389-6f80-456a-8763-e813b6eab09c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-8p2mg" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.467743 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc7c9\" (UniqueName: \"kubernetes.io/projected/3b90b03e-51e6-4862-acff-02a1b895a10e-kube-api-access-vc7c9\") pod \"dnsmasq-dns-78dd6ddcc-vq5wc\" (UID: \"3b90b03e-51e6-4862-acff-02a1b895a10e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.467819 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3b90b03e-51e6-4862-acff-02a1b895a10e-config\") pod \"dnsmasq-dns-78dd6ddcc-vq5wc\" (UID: \"3b90b03e-51e6-4862-acff-02a1b895a10e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.467881 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b90b03e-51e6-4862-acff-02a1b895a10e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vq5wc\" (UID: \"3b90b03e-51e6-4862-acff-02a1b895a10e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.468955 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b90b03e-51e6-4862-acff-02a1b895a10e-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vq5wc\" (UID: \"3b90b03e-51e6-4862-acff-02a1b895a10e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.469384 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b90b03e-51e6-4862-acff-02a1b895a10e-config\") pod \"dnsmasq-dns-78dd6ddcc-vq5wc\" (UID: \"3b90b03e-51e6-4862-acff-02a1b895a10e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.494137 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc7c9\" (UniqueName: \"kubernetes.io/projected/3b90b03e-51e6-4862-acff-02a1b895a10e-kube-api-access-vc7c9\") pod \"dnsmasq-dns-78dd6ddcc-vq5wc\" (UID: \"3b90b03e-51e6-4862-acff-02a1b895a10e\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.529052 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8p2mg" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.624262 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.827139 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-whcvm"] Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.830214 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.832914 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whcvm"] Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.977670 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4gjz\" (UniqueName: \"kubernetes.io/projected/d44ca316-19b2-4a04-97c2-a77db9c711b6-kube-api-access-s4gjz\") pod \"certified-operators-whcvm\" (UID: \"d44ca316-19b2-4a04-97c2-a77db9c711b6\") " pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.977722 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d44ca316-19b2-4a04-97c2-a77db9c711b6-catalog-content\") pod \"certified-operators-whcvm\" (UID: \"d44ca316-19b2-4a04-97c2-a77db9c711b6\") " pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:24:38 crc kubenswrapper[4807]: I1127 11:24:38.977765 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d44ca316-19b2-4a04-97c2-a77db9c711b6-utilities\") pod \"certified-operators-whcvm\" (UID: \"d44ca316-19b2-4a04-97c2-a77db9c711b6\") " pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:24:39 crc kubenswrapper[4807]: I1127 11:24:39.007422 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-675f4bcbfc-8p2mg"] Nov 27 11:24:39 crc kubenswrapper[4807]: W1127 11:24:39.011471 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode52e0389_6f80_456a_8763_e813b6eab09c.slice/crio-0ea4031085fef19f6d4360db5581b902291c2af2237f5e7a9e9d19898365e5fa WatchSource:0}: Error finding container 0ea4031085fef19f6d4360db5581b902291c2af2237f5e7a9e9d19898365e5fa: Status 404 returned error can't find the container with id 0ea4031085fef19f6d4360db5581b902291c2af2237f5e7a9e9d19898365e5fa Nov 27 11:24:39 crc kubenswrapper[4807]: I1127 11:24:39.078773 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4gjz\" (UniqueName: \"kubernetes.io/projected/d44ca316-19b2-4a04-97c2-a77db9c711b6-kube-api-access-s4gjz\") pod \"certified-operators-whcvm\" (UID: \"d44ca316-19b2-4a04-97c2-a77db9c711b6\") " pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:24:39 crc kubenswrapper[4807]: I1127 11:24:39.078838 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d44ca316-19b2-4a04-97c2-a77db9c711b6-catalog-content\") pod \"certified-operators-whcvm\" (UID: \"d44ca316-19b2-4a04-97c2-a77db9c711b6\") " pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:24:39 crc kubenswrapper[4807]: I1127 11:24:39.078902 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d44ca316-19b2-4a04-97c2-a77db9c711b6-utilities\") pod \"certified-operators-whcvm\" (UID: \"d44ca316-19b2-4a04-97c2-a77db9c711b6\") " pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:24:39 crc kubenswrapper[4807]: I1127 11:24:39.079487 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d44ca316-19b2-4a04-97c2-a77db9c711b6-utilities\") pod \"certified-operators-whcvm\" (UID: \"d44ca316-19b2-4a04-97c2-a77db9c711b6\") " pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:24:39 crc kubenswrapper[4807]: I1127 11:24:39.079577 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d44ca316-19b2-4a04-97c2-a77db9c711b6-catalog-content\") pod \"certified-operators-whcvm\" (UID: \"d44ca316-19b2-4a04-97c2-a77db9c711b6\") " pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:24:39 crc kubenswrapper[4807]: I1127 11:24:39.098872 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4gjz\" (UniqueName: \"kubernetes.io/projected/d44ca316-19b2-4a04-97c2-a77db9c711b6-kube-api-access-s4gjz\") pod \"certified-operators-whcvm\" (UID: \"d44ca316-19b2-4a04-97c2-a77db9c711b6\") " pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:24:39 crc kubenswrapper[4807]: I1127 11:24:39.116764 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vq5wc"] Nov 27 11:24:39 crc kubenswrapper[4807]: W1127 11:24:39.135870 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b90b03e_51e6_4862_acff_02a1b895a10e.slice/crio-227d8a8ba9ffab4ced5fb9ed462676565fa2141b4b7b174120a6578e9fa30e46 WatchSource:0}: Error finding container 227d8a8ba9ffab4ced5fb9ed462676565fa2141b4b7b174120a6578e9fa30e46: Status 404 returned error can't find the container with id 227d8a8ba9ffab4ced5fb9ed462676565fa2141b4b7b174120a6578e9fa30e46 Nov 27 11:24:39 crc kubenswrapper[4807]: I1127 11:24:39.153049 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:24:39 crc kubenswrapper[4807]: I1127 11:24:39.365625 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-8p2mg" event={"ID":"e52e0389-6f80-456a-8763-e813b6eab09c","Type":"ContainerStarted","Data":"0ea4031085fef19f6d4360db5581b902291c2af2237f5e7a9e9d19898365e5fa"} Nov 27 11:24:39 crc kubenswrapper[4807]: I1127 11:24:39.366485 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" event={"ID":"3b90b03e-51e6-4862-acff-02a1b895a10e","Type":"ContainerStarted","Data":"227d8a8ba9ffab4ced5fb9ed462676565fa2141b4b7b174120a6578e9fa30e46"} Nov 27 11:24:39 crc kubenswrapper[4807]: I1127 11:24:39.665963 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whcvm"] Nov 27 11:24:40 crc kubenswrapper[4807]: I1127 11:24:40.375510 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whcvm" event={"ID":"d44ca316-19b2-4a04-97c2-a77db9c711b6","Type":"ContainerStarted","Data":"a2b1a8e4702185dee3f46f98d2b08caa1786f54dd2c9c9cb29036d406991c9db"} Nov 27 11:24:40 crc kubenswrapper[4807]: I1127 11:24:40.375842 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whcvm" event={"ID":"d44ca316-19b2-4a04-97c2-a77db9c711b6","Type":"ContainerStarted","Data":"6737d37ea3c2d8e686b82ee958e3d0db7ef209f7b727bdbdd1ad4b2a2174734b"} Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.279588 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.279641 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.355299 4807 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.383100 4807 generic.go:334] "Generic (PLEG): container finished" podID="d44ca316-19b2-4a04-97c2-a77db9c711b6" containerID="a2b1a8e4702185dee3f46f98d2b08caa1786f54dd2c9c9cb29036d406991c9db" exitCode=0 Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.383182 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whcvm" event={"ID":"d44ca316-19b2-4a04-97c2-a77db9c711b6","Type":"ContainerDied","Data":"a2b1a8e4702185dee3f46f98d2b08caa1786f54dd2c9c9cb29036d406991c9db"} Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.455670 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.579596 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8p2mg"] Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.600644 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2f54p"] Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.603150 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.614882 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2f54p"] Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.724605 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e4033d-b980-406c-b771-080c120e6ee4-config\") pod \"dnsmasq-dns-666b6646f7-2f54p\" (UID: \"b5e4033d-b980-406c-b771-080c120e6ee4\") " pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.724660 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5e4033d-b980-406c-b771-080c120e6ee4-dns-svc\") pod \"dnsmasq-dns-666b6646f7-2f54p\" (UID: \"b5e4033d-b980-406c-b771-080c120e6ee4\") " pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.724676 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd29m\" (UniqueName: \"kubernetes.io/projected/b5e4033d-b980-406c-b771-080c120e6ee4-kube-api-access-wd29m\") pod \"dnsmasq-dns-666b6646f7-2f54p\" (UID: \"b5e4033d-b980-406c-b771-080c120e6ee4\") " pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.826971 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vq5wc"] Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.827710 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e4033d-b980-406c-b771-080c120e6ee4-config\") pod \"dnsmasq-dns-666b6646f7-2f54p\" (UID: \"b5e4033d-b980-406c-b771-080c120e6ee4\") " pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:24:41 crc 
kubenswrapper[4807]: I1127 11:24:41.827779 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd29m\" (UniqueName: \"kubernetes.io/projected/b5e4033d-b980-406c-b771-080c120e6ee4-kube-api-access-wd29m\") pod \"dnsmasq-dns-666b6646f7-2f54p\" (UID: \"b5e4033d-b980-406c-b771-080c120e6ee4\") " pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.827800 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5e4033d-b980-406c-b771-080c120e6ee4-dns-svc\") pod \"dnsmasq-dns-666b6646f7-2f54p\" (UID: \"b5e4033d-b980-406c-b771-080c120e6ee4\") " pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.828795 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5e4033d-b980-406c-b771-080c120e6ee4-dns-svc\") pod \"dnsmasq-dns-666b6646f7-2f54p\" (UID: \"b5e4033d-b980-406c-b771-080c120e6ee4\") " pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.829382 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e4033d-b980-406c-b771-080c120e6ee4-config\") pod \"dnsmasq-dns-666b6646f7-2f54p\" (UID: \"b5e4033d-b980-406c-b771-080c120e6ee4\") " pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.850996 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jm8h6"] Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.853185 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.863443 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jm8h6"] Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.865701 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd29m\" (UniqueName: \"kubernetes.io/projected/b5e4033d-b980-406c-b771-080c120e6ee4-kube-api-access-wd29m\") pod \"dnsmasq-dns-666b6646f7-2f54p\" (UID: \"b5e4033d-b980-406c-b771-080c120e6ee4\") " pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.926589 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.930155 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvjnk\" (UniqueName: \"kubernetes.io/projected/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-kube-api-access-dvjnk\") pod \"dnsmasq-dns-57d769cc4f-jm8h6\" (UID: \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\") " pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.930203 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-config\") pod \"dnsmasq-dns-57d769cc4f-jm8h6\" (UID: \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\") " pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:24:41 crc kubenswrapper[4807]: I1127 11:24:41.930234 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jm8h6\" (UID: \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.031437 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvjnk\" (UniqueName: \"kubernetes.io/projected/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-kube-api-access-dvjnk\") pod \"dnsmasq-dns-57d769cc4f-jm8h6\" (UID: \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\") " pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.031506 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-config\") pod \"dnsmasq-dns-57d769cc4f-jm8h6\" (UID: \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\") " pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.031541 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jm8h6\" (UID: \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\") " pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.032929 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-config\") pod \"dnsmasq-dns-57d769cc4f-jm8h6\" (UID: \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\") " pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.033533 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-jm8h6\" (UID: \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\") " pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.052793 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvjnk\" (UniqueName: \"kubernetes.io/projected/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-kube-api-access-dvjnk\") pod \"dnsmasq-dns-57d769cc4f-jm8h6\" (UID: \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\") " pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.196790 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.557780 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2f54p"] Nov 27 11:24:42 crc kubenswrapper[4807]: W1127 11:24:42.567431 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5e4033d_b980_406c_b771_080c120e6ee4.slice/crio-242c89baa5a7096dcf090b61f531a2f1c45ba6b14198fac6af3f4604607a5c32 WatchSource:0}: Error finding container 242c89baa5a7096dcf090b61f531a2f1c45ba6b14198fac6af3f4604607a5c32: Status 404 returned error can't find the container with id 242c89baa5a7096dcf090b61f531a2f1c45ba6b14198fac6af3f4604607a5c32 Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.716342 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.717807 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.720827 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jm8h6"] Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.721592 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.721674 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-tt85w" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.721706 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.721793 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.721839 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.721975 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.722151 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.737726 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.890781 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 
11:24:42.890824 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gnwm\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-kube-api-access-7gnwm\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.890862 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.890885 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.890909 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.890924 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.890941 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-config-data\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.890967 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e153e04c-cadb-4d8a-9863-9ef60eac08e9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.890984 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.891013 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.891031 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e153e04c-cadb-4d8a-9863-9ef60eac08e9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.988358 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 11:24:42 
crc kubenswrapper[4807]: I1127 11:24:42.989535 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.991828 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e153e04c-cadb-4d8a-9863-9ef60eac08e9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.991851 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.991879 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.991898 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e153e04c-cadb-4d8a-9863-9ef60eac08e9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.991940 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" 
Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.991955 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gnwm\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-kube-api-access-7gnwm\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.991982 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.991999 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.992021 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.992036 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.992058 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-config-data\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.992784 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-config-data\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.993626 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.993788 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.993824 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.993892 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.993926 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.993963 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.994392 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 
11:24:42.994467 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.994714 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.995136 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.996796 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:42 crc kubenswrapper[4807]: I1127 11:24:42.999545 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f6jm6" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.004823 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e153e04c-cadb-4d8a-9863-9ef60eac08e9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: 
I1127 11:24:43.004941 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.006205 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.012938 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e153e04c-cadb-4d8a-9863-9ef60eac08e9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.022978 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.024448 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gnwm\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-kube-api-access-7gnwm\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.052656 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " pod="openstack/rabbitmq-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.063450 4807 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.093623 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.093678 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.093769 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.093811 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.093891 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn86n\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-kube-api-access-rn86n\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.093929 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b811158c-3b16-415b-95df-baba9483d782-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.093947 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.093989 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.094041 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b811158c-3b16-415b-95df-baba9483d782-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.094078 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.094144 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.195143 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.195431 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.195454 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.195488 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 
11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.195521 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn86n\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-kube-api-access-rn86n\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.195562 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b811158c-3b16-415b-95df-baba9483d782-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.195581 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.195606 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.196652 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.201685 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b811158c-3b16-415b-95df-baba9483d782-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.201929 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.202170 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.202329 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.202528 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.202689 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.204025 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.205185 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b811158c-3b16-415b-95df-baba9483d782-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.206443 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b811158c-3b16-415b-95df-baba9483d782-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.225105 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.225801 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.226962 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.232949 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn86n\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-kube-api-access-rn86n\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.268163 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.413208 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" event={"ID":"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c","Type":"ContainerStarted","Data":"70daac6db7cb698d58d490aa69134fa37b4e76c4ac1c0115c33e04f2a3bb9782"} Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.415611 4807 generic.go:334] "Generic (PLEG): container finished" podID="d44ca316-19b2-4a04-97c2-a77db9c711b6" containerID="310ef764322c05b20f640d5bc2e4ea0e3e25f44a5ecb92df7212a44c0e372585" exitCode=0 Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.415732 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whcvm" 
event={"ID":"d44ca316-19b2-4a04-97c2-a77db9c711b6","Type":"ContainerDied","Data":"310ef764322c05b20f640d5bc2e4ea0e3e25f44a5ecb92df7212a44c0e372585"} Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.418695 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" event={"ID":"b5e4033d-b980-406c-b771-080c120e6ee4","Type":"ContainerStarted","Data":"242c89baa5a7096dcf090b61f531a2f1c45ba6b14198fac6af3f4604607a5c32"} Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.480906 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.509186 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.599365 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p4xz7"] Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.599906 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p4xz7" podUID="04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" containerName="registry-server" containerID="cri-o://e6b167b6ae552deb5b6d0e69e851470f040d9d1ebc806cfa5e35ab060bc57425" gracePeriod=2 Nov 27 11:24:43 crc kubenswrapper[4807]: I1127 11:24:43.719636 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.404793 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.406526 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.412996 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-ckdc9" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.413490 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.416828 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.417534 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.419400 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.424878 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.445054 4807 generic.go:334] "Generic (PLEG): container finished" podID="04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" containerID="e6b167b6ae552deb5b6d0e69e851470f040d9d1ebc806cfa5e35ab060bc57425" exitCode=0 Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.445121 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4xz7" event={"ID":"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7","Type":"ContainerDied","Data":"e6b167b6ae552deb5b6d0e69e851470f040d9d1ebc806cfa5e35ab060bc57425"} Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.455456 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b811158c-3b16-415b-95df-baba9483d782","Type":"ContainerStarted","Data":"283f0d96b24f760cf6e323867815a3f05b30b2c0ef9be4d581634c55ea9127f6"} Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 
11:24:44.458871 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e153e04c-cadb-4d8a-9863-9ef60eac08e9","Type":"ContainerStarted","Data":"9c9cd4fa96b0df6ce90c09b402bb6e5b15f47ed9cc97efe2e769bea43d7e936a"} Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.540439 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/122d837d-ee30-4e26-9e01-1f4bd8ebaace-config-data-default\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.540493 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.540528 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/122d837d-ee30-4e26-9e01-1f4bd8ebaace-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.540546 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsp9c\" (UniqueName: \"kubernetes.io/projected/122d837d-ee30-4e26-9e01-1f4bd8ebaace-kube-api-access-qsp9c\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.540576 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/122d837d-ee30-4e26-9e01-1f4bd8ebaace-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.540595 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/122d837d-ee30-4e26-9e01-1f4bd8ebaace-kolla-config\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.540641 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/122d837d-ee30-4e26-9e01-1f4bd8ebaace-operator-scripts\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.540664 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/122d837d-ee30-4e26-9e01-1f4bd8ebaace-config-data-generated\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.645708 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/122d837d-ee30-4e26-9e01-1f4bd8ebaace-operator-scripts\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.645766 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/122d837d-ee30-4e26-9e01-1f4bd8ebaace-config-data-generated\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.645814 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/122d837d-ee30-4e26-9e01-1f4bd8ebaace-config-data-default\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.645850 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.645891 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/122d837d-ee30-4e26-9e01-1f4bd8ebaace-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.645914 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsp9c\" (UniqueName: \"kubernetes.io/projected/122d837d-ee30-4e26-9e01-1f4bd8ebaace-kube-api-access-qsp9c\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.645952 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122d837d-ee30-4e26-9e01-1f4bd8ebaace-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.645977 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/122d837d-ee30-4e26-9e01-1f4bd8ebaace-kolla-config\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.646768 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/122d837d-ee30-4e26-9e01-1f4bd8ebaace-kolla-config\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.646825 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/122d837d-ee30-4e26-9e01-1f4bd8ebaace-config-data-default\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.647057 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.647065 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/122d837d-ee30-4e26-9e01-1f4bd8ebaace-config-data-generated\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.648769 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/122d837d-ee30-4e26-9e01-1f4bd8ebaace-operator-scripts\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.653362 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/122d837d-ee30-4e26-9e01-1f4bd8ebaace-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.658001 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/122d837d-ee30-4e26-9e01-1f4bd8ebaace-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.672132 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.673705 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsp9c\" (UniqueName: \"kubernetes.io/projected/122d837d-ee30-4e26-9e01-1f4bd8ebaace-kube-api-access-qsp9c\") pod \"openstack-galera-0\" (UID: \"122d837d-ee30-4e26-9e01-1f4bd8ebaace\") " pod="openstack/openstack-galera-0" Nov 27 11:24:44 crc kubenswrapper[4807]: I1127 11:24:44.724547 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.725188 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.727080 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.731122 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.731485 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9p98c" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.731631 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.731741 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.732195 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.867135 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.867209 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx7sz\" (UniqueName: \"kubernetes.io/projected/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-kube-api-access-sx7sz\") pod \"openstack-cell1-galera-0\" 
(UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.867262 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.867315 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.867362 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.867414 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.867475 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-config-data-default\") pod \"openstack-cell1-galera-0\" 
(UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.867529 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.968688 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx7sz\" (UniqueName: \"kubernetes.io/projected/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-kube-api-access-sx7sz\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.968749 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.968780 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.968804 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " 
pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.968837 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.968867 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.968897 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.968930 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.969366 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 
crc kubenswrapper[4807]: I1127 11:24:45.969469 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.970493 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.971193 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.971770 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.973317 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.973449 4807 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:45 crc kubenswrapper[4807]: I1127 11:24:45.986594 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx7sz\" (UniqueName: \"kubernetes.io/projected/6603c2ee-9ab6-476c-8db6-d073f0dec3aa-kube-api-access-sx7sz\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.001349 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6603c2ee-9ab6-476c-8db6-d073f0dec3aa\") " pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.060300 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.263445 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.264712 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.266529 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9mfcf" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.268649 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.268716 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.281722 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.374173 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ecce491-4a06-4922-8353-0586ac99471b-kolla-config\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.374480 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecce491-4a06-4922-8353-0586ac99471b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.374550 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecce491-4a06-4922-8353-0586ac99471b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.374731 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-hdk2m\" (UniqueName: \"kubernetes.io/projected/8ecce491-4a06-4922-8353-0586ac99471b-kube-api-access-hdk2m\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.374781 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ecce491-4a06-4922-8353-0586ac99471b-config-data\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.477873 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecce491-4a06-4922-8353-0586ac99471b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.477963 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdk2m\" (UniqueName: \"kubernetes.io/projected/8ecce491-4a06-4922-8353-0586ac99471b-kube-api-access-hdk2m\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.477987 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ecce491-4a06-4922-8353-0586ac99471b-config-data\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.478017 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ecce491-4a06-4922-8353-0586ac99471b-kolla-config\") pod \"memcached-0\" (UID: 
\"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.478036 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecce491-4a06-4922-8353-0586ac99471b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.478923 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ecce491-4a06-4922-8353-0586ac99471b-config-data\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.478946 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ecce491-4a06-4922-8353-0586ac99471b-kolla-config\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.485329 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecce491-4a06-4922-8353-0586ac99471b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.497843 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecce491-4a06-4922-8353-0586ac99471b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.499366 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdk2m\" (UniqueName: 
\"kubernetes.io/projected/8ecce491-4a06-4922-8353-0586ac99471b-kube-api-access-hdk2m\") pod \"memcached-0\" (UID: \"8ecce491-4a06-4922-8353-0586ac99471b\") " pod="openstack/memcached-0" Nov 27 11:24:46 crc kubenswrapper[4807]: I1127 11:24:46.610533 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 27 11:24:47 crc kubenswrapper[4807]: I1127 11:24:47.982686 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 11:24:47 crc kubenswrapper[4807]: I1127 11:24:47.983921 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 11:24:47 crc kubenswrapper[4807]: I1127 11:24:47.989447 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-rrtp9" Nov 27 11:24:48 crc kubenswrapper[4807]: I1127 11:24:48.001683 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 11:24:48 crc kubenswrapper[4807]: I1127 11:24:48.107819 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcr4j\" (UniqueName: \"kubernetes.io/projected/8cfd9070-b1bd-4a24-b694-ae4ff059ec1c-kube-api-access-pcr4j\") pod \"kube-state-metrics-0\" (UID: \"8cfd9070-b1bd-4a24-b694-ae4ff059ec1c\") " pod="openstack/kube-state-metrics-0" Nov 27 11:24:48 crc kubenswrapper[4807]: I1127 11:24:48.209823 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcr4j\" (UniqueName: \"kubernetes.io/projected/8cfd9070-b1bd-4a24-b694-ae4ff059ec1c-kube-api-access-pcr4j\") pod \"kube-state-metrics-0\" (UID: \"8cfd9070-b1bd-4a24-b694-ae4ff059ec1c\") " pod="openstack/kube-state-metrics-0" Nov 27 11:24:48 crc kubenswrapper[4807]: I1127 11:24:48.235425 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcr4j\" (UniqueName: 
\"kubernetes.io/projected/8cfd9070-b1bd-4a24-b694-ae4ff059ec1c-kube-api-access-pcr4j\") pod \"kube-state-metrics-0\" (UID: \"8cfd9070-b1bd-4a24-b694-ae4ff059ec1c\") " pod="openstack/kube-state-metrics-0" Nov 27 11:24:48 crc kubenswrapper[4807]: I1127 11:24:48.308101 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 27 11:24:48 crc kubenswrapper[4807]: I1127 11:24:48.816657 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wv2x5"] Nov 27 11:24:48 crc kubenswrapper[4807]: I1127 11:24:48.818674 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:24:48 crc kubenswrapper[4807]: I1127 11:24:48.837935 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wv2x5"] Nov 27 11:24:48 crc kubenswrapper[4807]: I1127 11:24:48.920715 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4-utilities\") pod \"community-operators-wv2x5\" (UID: \"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4\") " pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:24:48 crc kubenswrapper[4807]: I1127 11:24:48.920831 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4-catalog-content\") pod \"community-operators-wv2x5\" (UID: \"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4\") " pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:24:48 crc kubenswrapper[4807]: I1127 11:24:48.920921 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nql4q\" (UniqueName: 
\"kubernetes.io/projected/c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4-kube-api-access-nql4q\") pod \"community-operators-wv2x5\" (UID: \"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4\") " pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:24:49 crc kubenswrapper[4807]: I1127 11:24:49.022882 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nql4q\" (UniqueName: \"kubernetes.io/projected/c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4-kube-api-access-nql4q\") pod \"community-operators-wv2x5\" (UID: \"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4\") " pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:24:49 crc kubenswrapper[4807]: I1127 11:24:49.023105 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4-utilities\") pod \"community-operators-wv2x5\" (UID: \"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4\") " pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:24:49 crc kubenswrapper[4807]: I1127 11:24:49.023174 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4-catalog-content\") pod \"community-operators-wv2x5\" (UID: \"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4\") " pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:24:49 crc kubenswrapper[4807]: I1127 11:24:49.024103 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4-catalog-content\") pod \"community-operators-wv2x5\" (UID: \"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4\") " pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:24:49 crc kubenswrapper[4807]: I1127 11:24:49.024563 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4-utilities\") pod \"community-operators-wv2x5\" (UID: \"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4\") " pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:24:49 crc kubenswrapper[4807]: I1127 11:24:49.046020 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nql4q\" (UniqueName: \"kubernetes.io/projected/c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4-kube-api-access-nql4q\") pod \"community-operators-wv2x5\" (UID: \"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4\") " pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:24:49 crc kubenswrapper[4807]: I1127 11:24:49.148996 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:24:51 crc kubenswrapper[4807]: E1127 11:24:51.279465 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e6b167b6ae552deb5b6d0e69e851470f040d9d1ebc806cfa5e35ab060bc57425 is running failed: container process not found" containerID="e6b167b6ae552deb5b6d0e69e851470f040d9d1ebc806cfa5e35ab060bc57425" cmd=["grpc_health_probe","-addr=:50051"] Nov 27 11:24:51 crc kubenswrapper[4807]: E1127 11:24:51.280155 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e6b167b6ae552deb5b6d0e69e851470f040d9d1ebc806cfa5e35ab060bc57425 is running failed: container process not found" containerID="e6b167b6ae552deb5b6d0e69e851470f040d9d1ebc806cfa5e35ab060bc57425" cmd=["grpc_health_probe","-addr=:50051"] Nov 27 11:24:51 crc kubenswrapper[4807]: E1127 11:24:51.280844 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e6b167b6ae552deb5b6d0e69e851470f040d9d1ebc806cfa5e35ab060bc57425 is 
running failed: container process not found" containerID="e6b167b6ae552deb5b6d0e69e851470f040d9d1ebc806cfa5e35ab060bc57425" cmd=["grpc_health_probe","-addr=:50051"] Nov 27 11:24:51 crc kubenswrapper[4807]: E1127 11:24:51.280873 4807 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e6b167b6ae552deb5b6d0e69e851470f040d9d1ebc806cfa5e35ab060bc57425 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-p4xz7" podUID="04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" containerName="registry-server" Nov 27 11:24:51 crc kubenswrapper[4807]: I1127 11:24:51.704538 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:51 crc kubenswrapper[4807]: I1127 11:24:51.893007 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsvr5\" (UniqueName: \"kubernetes.io/projected/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-kube-api-access-rsvr5\") pod \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\" (UID: \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\") " Nov 27 11:24:51 crc kubenswrapper[4807]: I1127 11:24:51.893151 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-catalog-content\") pod \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\" (UID: \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\") " Nov 27 11:24:51 crc kubenswrapper[4807]: I1127 11:24:51.893203 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-utilities\") pod \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\" (UID: \"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7\") " Nov 27 11:24:51 crc kubenswrapper[4807]: I1127 11:24:51.894172 4807 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-utilities" (OuterVolumeSpecName: "utilities") pod "04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" (UID: "04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:24:51 crc kubenswrapper[4807]: I1127 11:24:51.910697 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-kube-api-access-rsvr5" (OuterVolumeSpecName: "kube-api-access-rsvr5") pod "04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" (UID: "04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7"). InnerVolumeSpecName "kube-api-access-rsvr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.006825 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsvr5\" (UniqueName: \"kubernetes.io/projected/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-kube-api-access-rsvr5\") on node \"crc\" DevicePath \"\"" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.006867 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.050560 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" (UID: "04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.108157 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.173878 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-64nw4"] Nov 27 11:24:52 crc kubenswrapper[4807]: E1127 11:24:52.174228 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" containerName="extract-content" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.174259 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" containerName="extract-content" Nov 27 11:24:52 crc kubenswrapper[4807]: E1127 11:24:52.174295 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" containerName="extract-utilities" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.174301 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" containerName="extract-utilities" Nov 27 11:24:52 crc kubenswrapper[4807]: E1127 11:24:52.174325 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" containerName="registry-server" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.174334 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" containerName="registry-server" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.174506 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" containerName="registry-server" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.175064 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.177775 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.178079 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.178588 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nrhwv" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.180684 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-26rzj"] Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.182271 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.187103 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-64nw4"] Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.206036 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-26rzj"] Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.212013 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/356f01bb-6304-499b-946d-1e9f3d6e7572-ovn-controller-tls-certs\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.212067 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6nvx\" (UniqueName: \"kubernetes.io/projected/9bac140b-1bac-4d27-bb66-111e66af1edf-kube-api-access-q6nvx\") pod \"ovn-controller-ovs-26rzj\" (UID: 
\"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.212146 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9bac140b-1bac-4d27-bb66-111e66af1edf-var-lib\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.212180 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/356f01bb-6304-499b-946d-1e9f3d6e7572-var-run-ovn\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.212221 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/356f01bb-6304-499b-946d-1e9f3d6e7572-scripts\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.212262 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/356f01bb-6304-499b-946d-1e9f3d6e7572-var-log-ovn\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.212308 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bac140b-1bac-4d27-bb66-111e66af1edf-var-run\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " 
pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.212337 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9bac140b-1bac-4d27-bb66-111e66af1edf-etc-ovs\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.212372 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9bac140b-1bac-4d27-bb66-111e66af1edf-var-log\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.212447 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356f01bb-6304-499b-946d-1e9f3d6e7572-combined-ca-bundle\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.212495 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/356f01bb-6304-499b-946d-1e9f3d6e7572-var-run\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.212553 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d6dc\" (UniqueName: \"kubernetes.io/projected/356f01bb-6304-499b-946d-1e9f3d6e7572-kube-api-access-5d6dc\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 
27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.212606 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bac140b-1bac-4d27-bb66-111e66af1edf-scripts\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.315366 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/356f01bb-6304-499b-946d-1e9f3d6e7572-var-run\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.315463 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d6dc\" (UniqueName: \"kubernetes.io/projected/356f01bb-6304-499b-946d-1e9f3d6e7572-kube-api-access-5d6dc\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.315687 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bac140b-1bac-4d27-bb66-111e66af1edf-scripts\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.315773 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/356f01bb-6304-499b-946d-1e9f3d6e7572-ovn-controller-tls-certs\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.315807 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-q6nvx\" (UniqueName: \"kubernetes.io/projected/9bac140b-1bac-4d27-bb66-111e66af1edf-kube-api-access-q6nvx\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.315884 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9bac140b-1bac-4d27-bb66-111e66af1edf-var-lib\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.315912 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/356f01bb-6304-499b-946d-1e9f3d6e7572-var-run-ovn\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.315949 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/356f01bb-6304-499b-946d-1e9f3d6e7572-scripts\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.315972 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/356f01bb-6304-499b-946d-1e9f3d6e7572-var-log-ovn\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.316029 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/9bac140b-1bac-4d27-bb66-111e66af1edf-var-run\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.316059 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9bac140b-1bac-4d27-bb66-111e66af1edf-etc-ovs\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.316094 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9bac140b-1bac-4d27-bb66-111e66af1edf-var-log\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.316163 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356f01bb-6304-499b-946d-1e9f3d6e7572-combined-ca-bundle\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.316372 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/356f01bb-6304-499b-946d-1e9f3d6e7572-var-run\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.316485 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9bac140b-1bac-4d27-bb66-111e66af1edf-var-run\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " 
pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.316532 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/356f01bb-6304-499b-946d-1e9f3d6e7572-var-run-ovn\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.316798 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9bac140b-1bac-4d27-bb66-111e66af1edf-var-lib\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.317092 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/356f01bb-6304-499b-946d-1e9f3d6e7572-var-log-ovn\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.317141 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9bac140b-1bac-4d27-bb66-111e66af1edf-etc-ovs\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.317269 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9bac140b-1bac-4d27-bb66-111e66af1edf-var-log\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.319126 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/356f01bb-6304-499b-946d-1e9f3d6e7572-scripts\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.320988 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/356f01bb-6304-499b-946d-1e9f3d6e7572-combined-ca-bundle\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.321837 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/356f01bb-6304-499b-946d-1e9f3d6e7572-ovn-controller-tls-certs\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.334416 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bac140b-1bac-4d27-bb66-111e66af1edf-scripts\") pod \"ovn-controller-ovs-26rzj\" (UID: \"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.335037 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d6dc\" (UniqueName: \"kubernetes.io/projected/356f01bb-6304-499b-946d-1e9f3d6e7572-kube-api-access-5d6dc\") pod \"ovn-controller-64nw4\" (UID: \"356f01bb-6304-499b-946d-1e9f3d6e7572\") " pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.346437 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6nvx\" (UniqueName: \"kubernetes.io/projected/9bac140b-1bac-4d27-bb66-111e66af1edf-kube-api-access-q6nvx\") pod \"ovn-controller-ovs-26rzj\" (UID: 
\"9bac140b-1bac-4d27-bb66-111e66af1edf\") " pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.506508 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-64nw4" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.514622 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.524107 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4xz7" event={"ID":"04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7","Type":"ContainerDied","Data":"a99f9b1390513c768f25ca2ddb7a96c817c47784f418ef927815d2f428c2117b"} Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.524175 4807 scope.go:117] "RemoveContainer" containerID="e6b167b6ae552deb5b6d0e69e851470f040d9d1ebc806cfa5e35ab060bc57425" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.524323 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4xz7" Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.587721 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p4xz7"] Nov 27 11:24:52 crc kubenswrapper[4807]: I1127 11:24:52.594448 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p4xz7"] Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.051831 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.053397 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.057659 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.058028 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-p4k7k" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.058362 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.058509 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.063505 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.071122 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.229371 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e47ac50c-3a93-46fd-94f2-5c83e02e1919-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.229669 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sqj2\" (UniqueName: \"kubernetes.io/projected/e47ac50c-3a93-46fd-94f2-5c83e02e1919-kube-api-access-2sqj2\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.229691 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e47ac50c-3a93-46fd-94f2-5c83e02e1919-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.229729 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e47ac50c-3a93-46fd-94f2-5c83e02e1919-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.229751 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e47ac50c-3a93-46fd-94f2-5c83e02e1919-config\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.229794 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e47ac50c-3a93-46fd-94f2-5c83e02e1919-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.229811 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.229831 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e47ac50c-3a93-46fd-94f2-5c83e02e1919-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.331715 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e47ac50c-3a93-46fd-94f2-5c83e02e1919-config\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.331814 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e47ac50c-3a93-46fd-94f2-5c83e02e1919-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.331832 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.331851 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e47ac50c-3a93-46fd-94f2-5c83e02e1919-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.331880 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e47ac50c-3a93-46fd-94f2-5c83e02e1919-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc 
kubenswrapper[4807]: I1127 11:24:53.331928 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sqj2\" (UniqueName: \"kubernetes.io/projected/e47ac50c-3a93-46fd-94f2-5c83e02e1919-kube-api-access-2sqj2\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.331945 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e47ac50c-3a93-46fd-94f2-5c83e02e1919-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.331977 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e47ac50c-3a93-46fd-94f2-5c83e02e1919-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.332482 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e47ac50c-3a93-46fd-94f2-5c83e02e1919-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.332617 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e47ac50c-3a93-46fd-94f2-5c83e02e1919-config\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.333504 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.334103 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e47ac50c-3a93-46fd-94f2-5c83e02e1919-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.337868 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e47ac50c-3a93-46fd-94f2-5c83e02e1919-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.347309 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e47ac50c-3a93-46fd-94f2-5c83e02e1919-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.347378 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e47ac50c-3a93-46fd-94f2-5c83e02e1919-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.347795 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sqj2\" (UniqueName: \"kubernetes.io/projected/e47ac50c-3a93-46fd-94f2-5c83e02e1919-kube-api-access-2sqj2\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " 
pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.353025 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"e47ac50c-3a93-46fd-94f2-5c83e02e1919\") " pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.374719 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 27 11:24:53 crc kubenswrapper[4807]: I1127 11:24:53.546986 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7" path="/var/lib/kubelet/pods/04cf0017-8ab1-4a1a-848e-9bbbdd3d36e7/volumes" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.184695 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.185889 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.188391 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-6vbq9" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.188428 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.188454 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.188910 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.204560 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.384611 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a8b97df-a50b-4cce-8035-28b23cbdaf72-config\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.384686 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8b97df-a50b-4cce-8035-28b23cbdaf72-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.384709 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btqw7\" (UniqueName: \"kubernetes.io/projected/0a8b97df-a50b-4cce-8035-28b23cbdaf72-kube-api-access-btqw7\") pod \"ovsdbserver-nb-0\" (UID: 
\"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.384750 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.384772 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0a8b97df-a50b-4cce-8035-28b23cbdaf72-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.385059 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8b97df-a50b-4cce-8035-28b23cbdaf72-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.385168 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a8b97df-a50b-4cce-8035-28b23cbdaf72-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.385275 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8b97df-a50b-4cce-8035-28b23cbdaf72-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc 
kubenswrapper[4807]: I1127 11:24:55.486759 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a8b97df-a50b-4cce-8035-28b23cbdaf72-config\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.486818 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8b97df-a50b-4cce-8035-28b23cbdaf72-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.486839 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btqw7\" (UniqueName: \"kubernetes.io/projected/0a8b97df-a50b-4cce-8035-28b23cbdaf72-kube-api-access-btqw7\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.486859 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.486876 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0a8b97df-a50b-4cce-8035-28b23cbdaf72-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.486937 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0a8b97df-a50b-4cce-8035-28b23cbdaf72-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.486955 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a8b97df-a50b-4cce-8035-28b23cbdaf72-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.486995 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8b97df-a50b-4cce-8035-28b23cbdaf72-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.487199 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.487726 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0a8b97df-a50b-4cce-8035-28b23cbdaf72-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.487957 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a8b97df-a50b-4cce-8035-28b23cbdaf72-config\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 
27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.488166 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0a8b97df-a50b-4cce-8035-28b23cbdaf72-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.490936 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8b97df-a50b-4cce-8035-28b23cbdaf72-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.491082 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a8b97df-a50b-4cce-8035-28b23cbdaf72-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.491405 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a8b97df-a50b-4cce-8035-28b23cbdaf72-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.503727 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btqw7\" (UniqueName: \"kubernetes.io/projected/0a8b97df-a50b-4cce-8035-28b23cbdaf72-kube-api-access-btqw7\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.513689 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"0a8b97df-a50b-4cce-8035-28b23cbdaf72\") " pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:55 crc kubenswrapper[4807]: I1127 11:24:55.806608 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 27 11:24:57 crc kubenswrapper[4807]: E1127 11:24:57.836239 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 27 11:24:57 crc kubenswrapper[4807]: E1127 11:24:57.836997 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vc7c9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-vq5wc_openstack(3b90b03e-51e6-4862-acff-02a1b895a10e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 11:24:57 crc kubenswrapper[4807]: E1127 11:24:57.838398 4807 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" podUID="3b90b03e-51e6-4862-acff-02a1b895a10e" Nov 27 11:24:57 crc kubenswrapper[4807]: I1127 11:24:57.961714 4807 scope.go:117] "RemoveContainer" containerID="33021570d0f703402e59a2914746f48142ec5e9f8f7908b32ca4f7662b1b8467" Nov 27 11:24:58 crc kubenswrapper[4807]: E1127 11:24:58.018001 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 27 11:24:58 crc kubenswrapper[4807]: E1127 11:24:58.018425 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h7pdn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-8p2mg_openstack(e52e0389-6f80-456a-8763-e813b6eab09c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 11:24:58 crc kubenswrapper[4807]: E1127 11:24:58.019642 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-8p2mg" podUID="e52e0389-6f80-456a-8763-e813b6eab09c" Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.127343 4807 scope.go:117] "RemoveContainer" containerID="940b0267ddbbf2d33a7f2b391875a4978edaa658bcf0f1cfdc2c9852c90fb598" Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.295472 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 27 11:24:58 crc kubenswrapper[4807]: W1127 11:24:58.304445 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod122d837d_ee30_4e26_9e01_1f4bd8ebaace.slice/crio-f8b527030c607fa46605710629e42f14836657fc369b76bab5bcc4e60dbc889f WatchSource:0}: Error finding container f8b527030c607fa46605710629e42f14836657fc369b76bab5bcc4e60dbc889f: Status 404 returned error can't find the container with id f8b527030c607fa46605710629e42f14836657fc369b76bab5bcc4e60dbc889f Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.514793 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.528267 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.545963 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 27 11:24:58 crc kubenswrapper[4807]: W1127 11:24:58.560884 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ecce491_4a06_4922_8353_0586ac99471b.slice/crio-ad846a4372351430438990894e5c455590271e0b82eb382d65ef833469cb608d WatchSource:0}: Error finding container ad846a4372351430438990894e5c455590271e0b82eb382d65ef833469cb608d: Status 404 returned error can't find the container with id ad846a4372351430438990894e5c455590271e0b82eb382d65ef833469cb608d Nov 27 11:24:58 crc 
kubenswrapper[4807]: I1127 11:24:58.592284 4807 generic.go:334] "Generic (PLEG): container finished" podID="b9a3b2c2-7e2b-4d86-969a-3abba7d4773c" containerID="d02eddd35d46dadeec18c4a0a36460fa3fd3f8ca50dc6de477b4904951b50a56" exitCode=0 Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.592348 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" event={"ID":"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c","Type":"ContainerDied","Data":"d02eddd35d46dadeec18c4a0a36460fa3fd3f8ca50dc6de477b4904951b50a56"} Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.593234 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6603c2ee-9ab6-476c-8db6-d073f0dec3aa","Type":"ContainerStarted","Data":"506accc09bcf2eb1c877e3f0d95b23f24e7744f2a817d4a92bc424c94b564369"} Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.595713 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whcvm" event={"ID":"d44ca316-19b2-4a04-97c2-a77db9c711b6","Type":"ContainerStarted","Data":"c82cc1079855cbab4e2d3b0a3761b9408a86c559431a1b67fc39daf08191937a"} Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.597290 4807 generic.go:334] "Generic (PLEG): container finished" podID="b5e4033d-b980-406c-b771-080c120e6ee4" containerID="9bcaf7a884dea0a3cb2589d64b0013a579cce8fa382cd962475d2d7394a07ee4" exitCode=0 Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.597358 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" event={"ID":"b5e4033d-b980-406c-b771-080c120e6ee4","Type":"ContainerDied","Data":"9bcaf7a884dea0a3cb2589d64b0013a579cce8fa382cd962475d2d7394a07ee4"} Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.600194 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"122d837d-ee30-4e26-9e01-1f4bd8ebaace","Type":"ContainerStarted","Data":"f8b527030c607fa46605710629e42f14836657fc369b76bab5bcc4e60dbc889f"} Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.601035 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8ecce491-4a06-4922-8353-0586ac99471b","Type":"ContainerStarted","Data":"ad846a4372351430438990894e5c455590271e0b82eb382d65ef833469cb608d"} Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.603005 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8cfd9070-b1bd-4a24-b694-ae4ff059ec1c","Type":"ContainerStarted","Data":"d9b7eb50fc8002c0dd8c6db3eb15b084899f99b7a36986ec1b09c8a004727220"} Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.709315 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-whcvm" podStartSLOduration=4.067383576 podStartE2EDuration="20.709290959s" podCreationTimestamp="2025-11-27 11:24:38 +0000 UTC" firstStartedPulling="2025-11-27 11:24:41.385521395 +0000 UTC m=+922.485019593" lastFinishedPulling="2025-11-27 11:24:58.027428778 +0000 UTC m=+939.126926976" observedRunningTime="2025-11-27 11:24:58.697314573 +0000 UTC m=+939.796812801" watchObservedRunningTime="2025-11-27 11:24:58.709290959 +0000 UTC m=+939.808789157" Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.750553 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wv2x5"] Nov 27 11:24:58 crc kubenswrapper[4807]: I1127 11:24:58.830848 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.038025 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-64nw4"] Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.129840 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-sb-0"] Nov 27 11:24:59 crc kubenswrapper[4807]: W1127 11:24:59.139605 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode47ac50c_3a93_46fd_94f2_5c83e02e1919.slice/crio-8069c66d14770d24ddcb0304d452d3257476426a6a9650d828bd471f226122d1 WatchSource:0}: Error finding container 8069c66d14770d24ddcb0304d452d3257476426a6a9650d828bd471f226122d1: Status 404 returned error can't find the container with id 8069c66d14770d24ddcb0304d452d3257476426a6a9650d828bd471f226122d1 Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.153605 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.153643 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.561765 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8p2mg" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.572197 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.642068 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e153e04c-cadb-4d8a-9863-9ef60eac08e9","Type":"ContainerStarted","Data":"83333aee30e5bc4ccaa87405f238b79bd7e64ebbe703fe9bddfe4658d8041fd5"} Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.643545 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" event={"ID":"3b90b03e-51e6-4862-acff-02a1b895a10e","Type":"ContainerDied","Data":"227d8a8ba9ffab4ced5fb9ed462676565fa2141b4b7b174120a6578e9fa30e46"} Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.643618 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vq5wc" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.645292 4807 generic.go:334] "Generic (PLEG): container finished" podID="c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4" containerID="1358e715ad8262e0e277b0251ed498ce2489c0a856f16583a240fc59bcda7d26" exitCode=0 Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.645365 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wv2x5" event={"ID":"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4","Type":"ContainerDied","Data":"1358e715ad8262e0e277b0251ed498ce2489c0a856f16583a240fc59bcda7d26"} Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.645388 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wv2x5" event={"ID":"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4","Type":"ContainerStarted","Data":"33a66c1375d4ced76a64e74c82881c7e10557090f28f1351150ff34043fb3158"} Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.648200 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" 
event={"ID":"b5e4033d-b980-406c-b771-080c120e6ee4","Type":"ContainerStarted","Data":"77b4f1ddc63c72b0deb7d2650e634467a0363051378d934a13c4b03db57aa0cf"} Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.648288 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.652759 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0a8b97df-a50b-4cce-8035-28b23cbdaf72","Type":"ContainerStarted","Data":"033562d1bd92a4fc85e491c6855d905612fe3bda29698093e519921dc1f9f05c"} Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.655357 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-8p2mg" event={"ID":"e52e0389-6f80-456a-8763-e813b6eab09c","Type":"ContainerDied","Data":"0ea4031085fef19f6d4360db5581b902291c2af2237f5e7a9e9d19898365e5fa"} Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.655449 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-8p2mg" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.661020 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b811158c-3b16-415b-95df-baba9483d782","Type":"ContainerStarted","Data":"9e6765c1329a0e3dbe2c7ebbe98178418e0634a6b096c810e4a057d37376b58b"} Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.662534 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e47ac50c-3a93-46fd-94f2-5c83e02e1919","Type":"ContainerStarted","Data":"8069c66d14770d24ddcb0304d452d3257476426a6a9650d828bd471f226122d1"} Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.663845 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7pdn\" (UniqueName: \"kubernetes.io/projected/e52e0389-6f80-456a-8763-e813b6eab09c-kube-api-access-h7pdn\") pod \"e52e0389-6f80-456a-8763-e813b6eab09c\" (UID: \"e52e0389-6f80-456a-8763-e813b6eab09c\") " Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.663883 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52e0389-6f80-456a-8763-e813b6eab09c-config\") pod \"e52e0389-6f80-456a-8763-e813b6eab09c\" (UID: \"e52e0389-6f80-456a-8763-e813b6eab09c\") " Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.664149 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-64nw4" event={"ID":"356f01bb-6304-499b-946d-1e9f3d6e7572","Type":"ContainerStarted","Data":"25db404533a727b483411834898abe5e97ce3d100658ef23365562d30291fa66"} Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.664652 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e52e0389-6f80-456a-8763-e813b6eab09c-config" (OuterVolumeSpecName: "config") pod "e52e0389-6f80-456a-8763-e813b6eab09c" (UID: 
"e52e0389-6f80-456a-8763-e813b6eab09c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.664928 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b90b03e-51e6-4862-acff-02a1b895a10e-config\") pod \"3b90b03e-51e6-4862-acff-02a1b895a10e\" (UID: \"3b90b03e-51e6-4862-acff-02a1b895a10e\") " Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.664956 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc7c9\" (UniqueName: \"kubernetes.io/projected/3b90b03e-51e6-4862-acff-02a1b895a10e-kube-api-access-vc7c9\") pod \"3b90b03e-51e6-4862-acff-02a1b895a10e\" (UID: \"3b90b03e-51e6-4862-acff-02a1b895a10e\") " Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.665214 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b90b03e-51e6-4862-acff-02a1b895a10e-dns-svc\") pod \"3b90b03e-51e6-4862-acff-02a1b895a10e\" (UID: \"3b90b03e-51e6-4862-acff-02a1b895a10e\") " Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.665456 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b90b03e-51e6-4862-acff-02a1b895a10e-config" (OuterVolumeSpecName: "config") pod "3b90b03e-51e6-4862-acff-02a1b895a10e" (UID: "3b90b03e-51e6-4862-acff-02a1b895a10e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.665639 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e52e0389-6f80-456a-8763-e813b6eab09c-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.665656 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b90b03e-51e6-4862-acff-02a1b895a10e-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.665836 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b90b03e-51e6-4862-acff-02a1b895a10e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b90b03e-51e6-4862-acff-02a1b895a10e" (UID: "3b90b03e-51e6-4862-acff-02a1b895a10e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.667910 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" event={"ID":"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c","Type":"ContainerStarted","Data":"3f4d05167779ed156fdf7b189cf31b61502f240a4cae0121613e05a0b9c2073a"} Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.667937 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.671260 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e52e0389-6f80-456a-8763-e813b6eab09c-kube-api-access-h7pdn" (OuterVolumeSpecName: "kube-api-access-h7pdn") pod "e52e0389-6f80-456a-8763-e813b6eab09c" (UID: "e52e0389-6f80-456a-8763-e813b6eab09c"). InnerVolumeSpecName "kube-api-access-h7pdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.683547 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b90b03e-51e6-4862-acff-02a1b895a10e-kube-api-access-vc7c9" (OuterVolumeSpecName: "kube-api-access-vc7c9") pod "3b90b03e-51e6-4862-acff-02a1b895a10e" (UID: "3b90b03e-51e6-4862-acff-02a1b895a10e"). InnerVolumeSpecName "kube-api-access-vc7c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.767030 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7pdn\" (UniqueName: \"kubernetes.io/projected/e52e0389-6f80-456a-8763-e813b6eab09c-kube-api-access-h7pdn\") on node \"crc\" DevicePath \"\"" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.767065 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc7c9\" (UniqueName: \"kubernetes.io/projected/3b90b03e-51e6-4862-acff-02a1b895a10e-kube-api-access-vc7c9\") on node \"crc\" DevicePath \"\"" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.767075 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b90b03e-51e6-4862-acff-02a1b895a10e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:24:59 crc kubenswrapper[4807]: I1127 11:24:59.767418 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" podStartSLOduration=3.210445672 podStartE2EDuration="18.767405538s" podCreationTimestamp="2025-11-27 11:24:41 +0000 UTC" firstStartedPulling="2025-11-27 11:24:42.570399022 +0000 UTC m=+923.669897220" lastFinishedPulling="2025-11-27 11:24:58.127358888 +0000 UTC m=+939.226857086" observedRunningTime="2025-11-27 11:24:59.762141569 +0000 UTC m=+940.861639767" watchObservedRunningTime="2025-11-27 11:24:59.767405538 +0000 UTC m=+940.866903736" Nov 27 11:24:59 crc 
kubenswrapper[4807]: I1127 11:24:59.997761 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" podStartSLOduration=3.7103551980000002 podStartE2EDuration="18.997742163s" podCreationTimestamp="2025-11-27 11:24:41 +0000 UTC" firstStartedPulling="2025-11-27 11:24:42.744388848 +0000 UTC m=+923.843887046" lastFinishedPulling="2025-11-27 11:24:58.031775813 +0000 UTC m=+939.131274011" observedRunningTime="2025-11-27 11:24:59.838784114 +0000 UTC m=+940.938282312" watchObservedRunningTime="2025-11-27 11:24:59.997742163 +0000 UTC m=+941.097240351" Nov 27 11:25:00 crc kubenswrapper[4807]: I1127 11:25:00.071463 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vq5wc"] Nov 27 11:25:00 crc kubenswrapper[4807]: I1127 11:25:00.080492 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vq5wc"] Nov 27 11:25:00 crc kubenswrapper[4807]: I1127 11:25:00.100734 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-26rzj"] Nov 27 11:25:00 crc kubenswrapper[4807]: I1127 11:25:00.120960 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8p2mg"] Nov 27 11:25:00 crc kubenswrapper[4807]: I1127 11:25:00.126659 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-8p2mg"] Nov 27 11:25:00 crc kubenswrapper[4807]: I1127 11:25:00.264834 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-whcvm" podUID="d44ca316-19b2-4a04-97c2-a77db9c711b6" containerName="registry-server" probeResult="failure" output=< Nov 27 11:25:00 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Nov 27 11:25:00 crc kubenswrapper[4807]: > Nov 27 11:25:01 crc kubenswrapper[4807]: I1127 11:25:01.546567 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3b90b03e-51e6-4862-acff-02a1b895a10e" path="/var/lib/kubelet/pods/3b90b03e-51e6-4862-acff-02a1b895a10e/volumes" Nov 27 11:25:01 crc kubenswrapper[4807]: I1127 11:25:01.547376 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e52e0389-6f80-456a-8763-e813b6eab09c" path="/var/lib/kubelet/pods/e52e0389-6f80-456a-8763-e813b6eab09c/volumes" Nov 27 11:25:02 crc kubenswrapper[4807]: I1127 11:25:02.700518 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-26rzj" event={"ID":"9bac140b-1bac-4d27-bb66-111e66af1edf","Type":"ContainerStarted","Data":"9d0ed16ad0aaeca5f92d6dcf79e960dd01972cf282df84dbffab10d538e7213b"} Nov 27 11:25:04 crc kubenswrapper[4807]: I1127 11:25:04.254525 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w9gjp"] Nov 27 11:25:04 crc kubenswrapper[4807]: I1127 11:25:04.261425 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9gjp"] Nov 27 11:25:04 crc kubenswrapper[4807]: I1127 11:25:04.261527 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:04 crc kubenswrapper[4807]: I1127 11:25:04.338936 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72480ac5-37a1-414b-b364-4290d9525ddb-utilities\") pod \"redhat-marketplace-w9gjp\" (UID: \"72480ac5-37a1-414b-b364-4290d9525ddb\") " pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:04 crc kubenswrapper[4807]: I1127 11:25:04.339347 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k6mz\" (UniqueName: \"kubernetes.io/projected/72480ac5-37a1-414b-b364-4290d9525ddb-kube-api-access-5k6mz\") pod \"redhat-marketplace-w9gjp\" (UID: \"72480ac5-37a1-414b-b364-4290d9525ddb\") " pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:04 crc kubenswrapper[4807]: I1127 11:25:04.339484 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72480ac5-37a1-414b-b364-4290d9525ddb-catalog-content\") pod \"redhat-marketplace-w9gjp\" (UID: \"72480ac5-37a1-414b-b364-4290d9525ddb\") " pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:04 crc kubenswrapper[4807]: I1127 11:25:04.440891 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72480ac5-37a1-414b-b364-4290d9525ddb-catalog-content\") pod \"redhat-marketplace-w9gjp\" (UID: \"72480ac5-37a1-414b-b364-4290d9525ddb\") " pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:04 crc kubenswrapper[4807]: I1127 11:25:04.440941 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72480ac5-37a1-414b-b364-4290d9525ddb-utilities\") pod \"redhat-marketplace-w9gjp\" 
(UID: \"72480ac5-37a1-414b-b364-4290d9525ddb\") " pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:04 crc kubenswrapper[4807]: I1127 11:25:04.440988 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k6mz\" (UniqueName: \"kubernetes.io/projected/72480ac5-37a1-414b-b364-4290d9525ddb-kube-api-access-5k6mz\") pod \"redhat-marketplace-w9gjp\" (UID: \"72480ac5-37a1-414b-b364-4290d9525ddb\") " pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:04 crc kubenswrapper[4807]: I1127 11:25:04.441472 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72480ac5-37a1-414b-b364-4290d9525ddb-catalog-content\") pod \"redhat-marketplace-w9gjp\" (UID: \"72480ac5-37a1-414b-b364-4290d9525ddb\") " pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:04 crc kubenswrapper[4807]: I1127 11:25:04.441711 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72480ac5-37a1-414b-b364-4290d9525ddb-utilities\") pod \"redhat-marketplace-w9gjp\" (UID: \"72480ac5-37a1-414b-b364-4290d9525ddb\") " pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:04 crc kubenswrapper[4807]: I1127 11:25:04.483005 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k6mz\" (UniqueName: \"kubernetes.io/projected/72480ac5-37a1-414b-b364-4290d9525ddb-kube-api-access-5k6mz\") pod \"redhat-marketplace-w9gjp\" (UID: \"72480ac5-37a1-414b-b364-4290d9525ddb\") " pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:04 crc kubenswrapper[4807]: I1127 11:25:04.596450 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:06 crc kubenswrapper[4807]: I1127 11:25:06.928411 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:25:07 crc kubenswrapper[4807]: I1127 11:25:07.203226 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:25:07 crc kubenswrapper[4807]: I1127 11:25:07.262867 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2f54p"] Nov 27 11:25:07 crc kubenswrapper[4807]: I1127 11:25:07.746276 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" podUID="b5e4033d-b980-406c-b771-080c120e6ee4" containerName="dnsmasq-dns" containerID="cri-o://77b4f1ddc63c72b0deb7d2650e634467a0363051378d934a13c4b03db57aa0cf" gracePeriod=10 Nov 27 11:25:08 crc kubenswrapper[4807]: I1127 11:25:08.756920 4807 generic.go:334] "Generic (PLEG): container finished" podID="b5e4033d-b980-406c-b771-080c120e6ee4" containerID="77b4f1ddc63c72b0deb7d2650e634467a0363051378d934a13c4b03db57aa0cf" exitCode=0 Nov 27 11:25:08 crc kubenswrapper[4807]: I1127 11:25:08.756968 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" event={"ID":"b5e4033d-b980-406c-b771-080c120e6ee4","Type":"ContainerDied","Data":"77b4f1ddc63c72b0deb7d2650e634467a0363051378d934a13c4b03db57aa0cf"} Nov 27 11:25:09 crc kubenswrapper[4807]: I1127 11:25:09.233062 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:25:09 crc kubenswrapper[4807]: I1127 11:25:09.272428 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:25:10 crc kubenswrapper[4807]: I1127 11:25:10.021913 4807 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-whcvm"] Nov 27 11:25:10 crc kubenswrapper[4807]: I1127 11:25:10.769478 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-whcvm" podUID="d44ca316-19b2-4a04-97c2-a77db9c711b6" containerName="registry-server" containerID="cri-o://c82cc1079855cbab4e2d3b0a3761b9408a86c559431a1b67fc39daf08191937a" gracePeriod=2 Nov 27 11:25:11 crc kubenswrapper[4807]: I1127 11:25:11.928101 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" podUID="b5e4033d-b980-406c-b771-080c120e6ee4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.94:5353: connect: connection refused" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.362140 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6pfcf"] Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.363843 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.367226 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.371741 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/186b6a8f-d303-440b-99ea-6502bac3e583-combined-ca-bundle\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.371898 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktz2p\" (UniqueName: \"kubernetes.io/projected/186b6a8f-d303-440b-99ea-6502bac3e583-kube-api-access-ktz2p\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.372092 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/186b6a8f-d303-440b-99ea-6502bac3e583-ovn-rundir\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.372169 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/186b6a8f-d303-440b-99ea-6502bac3e583-ovs-rundir\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.372346 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/186b6a8f-d303-440b-99ea-6502bac3e583-config\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.372432 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/186b6a8f-d303-440b-99ea-6502bac3e583-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.376654 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6pfcf"] Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.473707 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/186b6a8f-d303-440b-99ea-6502bac3e583-combined-ca-bundle\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.473765 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktz2p\" (UniqueName: \"kubernetes.io/projected/186b6a8f-d303-440b-99ea-6502bac3e583-kube-api-access-ktz2p\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.473805 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/186b6a8f-d303-440b-99ea-6502bac3e583-ovn-rundir\") pod \"ovn-controller-metrics-6pfcf\" (UID: 
\"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.473862 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/186b6a8f-d303-440b-99ea-6502bac3e583-ovs-rundir\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.473896 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/186b6a8f-d303-440b-99ea-6502bac3e583-config\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.473934 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/186b6a8f-d303-440b-99ea-6502bac3e583-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.474212 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/186b6a8f-d303-440b-99ea-6502bac3e583-ovn-rundir\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.474241 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/186b6a8f-d303-440b-99ea-6502bac3e583-ovs-rundir\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 
11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.475456 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/186b6a8f-d303-440b-99ea-6502bac3e583-config\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.475613 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-fsvdm"] Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.476825 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.479941 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.480662 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/186b6a8f-d303-440b-99ea-6502bac3e583-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.484849 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/186b6a8f-d303-440b-99ea-6502bac3e583-combined-ca-bundle\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.496434 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-fsvdm"] Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.496839 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktz2p\" (UniqueName: 
\"kubernetes.io/projected/186b6a8f-d303-440b-99ea-6502bac3e583-kube-api-access-ktz2p\") pod \"ovn-controller-metrics-6pfcf\" (UID: \"186b6a8f-d303-440b-99ea-6502bac3e583\") " pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.575981 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-fsvdm\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.576085 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-config\") pod \"dnsmasq-dns-7f896c8c65-fsvdm\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.576105 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j2ns\" (UniqueName: \"kubernetes.io/projected/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-kube-api-access-5j2ns\") pod \"dnsmasq-dns-7f896c8c65-fsvdm\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.576135 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-fsvdm\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.610811 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-fsvdm"] 
Nov 27 11:25:15 crc kubenswrapper[4807]: E1127 11:25:15.611371 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-5j2ns ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" podUID="255b51ea-f16b-49c5-a0ad-07bbedebbdaf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.652913 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zlflh"] Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.654173 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.655989 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.661859 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zlflh"] Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.677227 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-fsvdm\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.677329 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-config\") pod \"dnsmasq-dns-7f896c8c65-fsvdm\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.677352 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j2ns\" (UniqueName: 
\"kubernetes.io/projected/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-kube-api-access-5j2ns\") pod \"dnsmasq-dns-7f896c8c65-fsvdm\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.677373 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-fsvdm\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.678135 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-fsvdm\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.678234 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-config\") pod \"dnsmasq-dns-7f896c8c65-fsvdm\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.678263 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-fsvdm\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.684770 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6pfcf" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.697217 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j2ns\" (UniqueName: \"kubernetes.io/projected/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-kube-api-access-5j2ns\") pod \"dnsmasq-dns-7f896c8c65-fsvdm\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.779511 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqkw9\" (UniqueName: \"kubernetes.io/projected/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-kube-api-access-tqkw9\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.779759 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.779943 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.780030 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-dns-svc\") pod 
\"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.780095 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-config\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.818418 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.827288 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.880828 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-ovsdbserver-sb\") pod \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.880896 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-config\") pod \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.880921 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-dns-svc\") pod \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.880995 4807 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j2ns\" (UniqueName: \"kubernetes.io/projected/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-kube-api-access-5j2ns\") pod \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\" (UID: \"255b51ea-f16b-49c5-a0ad-07bbedebbdaf\") " Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.881090 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.881134 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.881163 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.881185 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-config\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.881217 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqkw9\" 
(UniqueName: \"kubernetes.io/projected/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-kube-api-access-tqkw9\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.881317 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "255b51ea-f16b-49c5-a0ad-07bbedebbdaf" (UID: "255b51ea-f16b-49c5-a0ad-07bbedebbdaf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.881354 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-config" (OuterVolumeSpecName: "config") pod "255b51ea-f16b-49c5-a0ad-07bbedebbdaf" (UID: "255b51ea-f16b-49c5-a0ad-07bbedebbdaf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.881368 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "255b51ea-f16b-49c5-a0ad-07bbedebbdaf" (UID: "255b51ea-f16b-49c5-a0ad-07bbedebbdaf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.882144 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.882094 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.882287 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-config\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.882363 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.885682 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-kube-api-access-5j2ns" (OuterVolumeSpecName: "kube-api-access-5j2ns") pod "255b51ea-f16b-49c5-a0ad-07bbedebbdaf" (UID: "255b51ea-f16b-49c5-a0ad-07bbedebbdaf"). InnerVolumeSpecName "kube-api-access-5j2ns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.897322 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqkw9\" (UniqueName: \"kubernetes.io/projected/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-kube-api-access-tqkw9\") pod \"dnsmasq-dns-86db49b7ff-zlflh\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.969031 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.982498 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j2ns\" (UniqueName: \"kubernetes.io/projected/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-kube-api-access-5j2ns\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.982533 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.982545 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:15 crc kubenswrapper[4807]: I1127 11:25:15.982556 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/255b51ea-f16b-49c5-a0ad-07bbedebbdaf-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:16 crc kubenswrapper[4807]: I1127 11:25:16.827486 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-fsvdm" Nov 27 11:25:16 crc kubenswrapper[4807]: I1127 11:25:16.902970 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-fsvdm"] Nov 27 11:25:16 crc kubenswrapper[4807]: I1127 11:25:16.908590 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-fsvdm"] Nov 27 11:25:16 crc kubenswrapper[4807]: I1127 11:25:16.928049 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" podUID="b5e4033d-b980-406c-b771-080c120e6ee4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.94:5353: connect: connection refused" Nov 27 11:25:17 crc kubenswrapper[4807]: I1127 11:25:17.542766 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="255b51ea-f16b-49c5-a0ad-07bbedebbdaf" path="/var/lib/kubelet/pods/255b51ea-f16b-49c5-a0ad-07bbedebbdaf/volumes" Nov 27 11:25:18 crc kubenswrapper[4807]: E1127 11:25:18.745023 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 27 11:25:18 crc kubenswrapper[4807]: E1127 11:25:18.745181 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nql4q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-wv2x5_openshift-marketplace(c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 11:25:18 crc kubenswrapper[4807]: E1127 11:25:18.746373 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-wv2x5" podUID="c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4" Nov 27 11:25:18 crc 
kubenswrapper[4807]: I1127 11:25:18.854999 4807 generic.go:334] "Generic (PLEG): container finished" podID="d44ca316-19b2-4a04-97c2-a77db9c711b6" containerID="c82cc1079855cbab4e2d3b0a3761b9408a86c559431a1b67fc39daf08191937a" exitCode=0 Nov 27 11:25:18 crc kubenswrapper[4807]: I1127 11:25:18.855076 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whcvm" event={"ID":"d44ca316-19b2-4a04-97c2-a77db9c711b6","Type":"ContainerDied","Data":"c82cc1079855cbab4e2d3b0a3761b9408a86c559431a1b67fc39daf08191937a"} Nov 27 11:25:18 crc kubenswrapper[4807]: E1127 11:25:18.932811 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-wv2x5" podUID="c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4" Nov 27 11:25:18 crc kubenswrapper[4807]: E1127 11:25:18.987641 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified" Nov 27 11:25:18 crc kubenswrapper[4807]: E1127 11:25:18.987885 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66fh9dh75h8hdch569h56ch74h5cbh65fhd8hdh9h66h548h7fh66h99hf8h554h5dch657h59bh669h579hdbh59dhf9h557h588h587h5c9q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-btqw7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Com
mand:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(0a8b97df-a50b-4cce-8035-28b23cbdaf72): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 11:25:19 crc kubenswrapper[4807]: E1127 11:25:19.154234 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c82cc1079855cbab4e2d3b0a3761b9408a86c559431a1b67fc39daf08191937a is running failed: container 
process not found" containerID="c82cc1079855cbab4e2d3b0a3761b9408a86c559431a1b67fc39daf08191937a" cmd=["grpc_health_probe","-addr=:50051"] Nov 27 11:25:19 crc kubenswrapper[4807]: E1127 11:25:19.156499 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c82cc1079855cbab4e2d3b0a3761b9408a86c559431a1b67fc39daf08191937a is running failed: container process not found" containerID="c82cc1079855cbab4e2d3b0a3761b9408a86c559431a1b67fc39daf08191937a" cmd=["grpc_health_probe","-addr=:50051"] Nov 27 11:25:19 crc kubenswrapper[4807]: E1127 11:25:19.156839 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c82cc1079855cbab4e2d3b0a3761b9408a86c559431a1b67fc39daf08191937a is running failed: container process not found" containerID="c82cc1079855cbab4e2d3b0a3761b9408a86c559431a1b67fc39daf08191937a" cmd=["grpc_health_probe","-addr=:50051"] Nov 27 11:25:19 crc kubenswrapper[4807]: E1127 11:25:19.156871 4807 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c82cc1079855cbab4e2d3b0a3761b9408a86c559431a1b67fc39daf08191937a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-whcvm" podUID="d44ca316-19b2-4a04-97c2-a77db9c711b6" containerName="registry-server" Nov 27 11:25:19 crc kubenswrapper[4807]: E1127 11:25:19.379771 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified" Nov 27 11:25:19 crc kubenswrapper[4807]: E1127 11:25:19.379966 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-sb,Image:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c8hbfh57bh668h687h676h64h64bh69h75h587h667h64fh679h5b8h7dh685h5b6h57dh6fh5f8h68ch5b5h678h59bh79h87h669h644h68dh679h7bq,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2sqj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecA
ction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-sb-0_openstack(e47ac50c-3a93-46fd-94f2-5c83e02e1919): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 11:25:19 crc kubenswrapper[4807]: E1127 11:25:19.623361 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Nov 27 11:25:19 crc 
kubenswrapper[4807]: E1127 11:25:19.624301 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n657h57dh58dh68fh547h68fh598h54ch5b7h5bfh58ch59h584h545h5ffh558h544h676h89h5d8hch8chb6hc9h55dh65fh555h678h66fh64bh5cdhdbq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,Sub
PathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5d6dc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-64nw4_openstack(356f01bb-6304-499b-946d-1e9f3d6e7572): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 11:25:19 crc kubenswrapper[4807]: E1127 11:25:19.625516 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-64nw4" podUID="356f01bb-6304-499b-946d-1e9f3d6e7572" Nov 27 11:25:19 crc kubenswrapper[4807]: I1127 11:25:19.793626 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:25:19 crc kubenswrapper[4807]: I1127 11:25:19.869541 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whcvm" event={"ID":"d44ca316-19b2-4a04-97c2-a77db9c711b6","Type":"ContainerDied","Data":"6737d37ea3c2d8e686b82ee958e3d0db7ef209f7b727bdbdd1ad4b2a2174734b"} Nov 27 11:25:19 crc kubenswrapper[4807]: I1127 11:25:19.869595 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whcvm" Nov 27 11:25:19 crc kubenswrapper[4807]: I1127 11:25:19.869601 4807 scope.go:117] "RemoveContainer" containerID="c82cc1079855cbab4e2d3b0a3761b9408a86c559431a1b67fc39daf08191937a" Nov 27 11:25:19 crc kubenswrapper[4807]: E1127 11:25:19.872407 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-64nw4" podUID="356f01bb-6304-499b-946d-1e9f3d6e7572" Nov 27 11:25:19 crc kubenswrapper[4807]: I1127 11:25:19.947644 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4gjz\" (UniqueName: \"kubernetes.io/projected/d44ca316-19b2-4a04-97c2-a77db9c711b6-kube-api-access-s4gjz\") pod \"d44ca316-19b2-4a04-97c2-a77db9c711b6\" (UID: \"d44ca316-19b2-4a04-97c2-a77db9c711b6\") " Nov 27 11:25:19 crc kubenswrapper[4807]: I1127 11:25:19.948021 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d44ca316-19b2-4a04-97c2-a77db9c711b6-catalog-content\") pod \"d44ca316-19b2-4a04-97c2-a77db9c711b6\" (UID: \"d44ca316-19b2-4a04-97c2-a77db9c711b6\") " Nov 27 11:25:19 crc kubenswrapper[4807]: I1127 11:25:19.948048 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d44ca316-19b2-4a04-97c2-a77db9c711b6-utilities\") pod \"d44ca316-19b2-4a04-97c2-a77db9c711b6\" (UID: \"d44ca316-19b2-4a04-97c2-a77db9c711b6\") " Nov 27 11:25:19 crc kubenswrapper[4807]: I1127 11:25:19.949099 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d44ca316-19b2-4a04-97c2-a77db9c711b6-utilities" (OuterVolumeSpecName: "utilities") pod "d44ca316-19b2-4a04-97c2-a77db9c711b6" (UID: "d44ca316-19b2-4a04-97c2-a77db9c711b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:25:19 crc kubenswrapper[4807]: I1127 11:25:19.953333 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d44ca316-19b2-4a04-97c2-a77db9c711b6-kube-api-access-s4gjz" (OuterVolumeSpecName: "kube-api-access-s4gjz") pod "d44ca316-19b2-4a04-97c2-a77db9c711b6" (UID: "d44ca316-19b2-4a04-97c2-a77db9c711b6"). InnerVolumeSpecName "kube-api-access-s4gjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.001654 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d44ca316-19b2-4a04-97c2-a77db9c711b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d44ca316-19b2-4a04-97c2-a77db9c711b6" (UID: "d44ca316-19b2-4a04-97c2-a77db9c711b6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.052662 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4gjz\" (UniqueName: \"kubernetes.io/projected/d44ca316-19b2-4a04-97c2-a77db9c711b6-kube-api-access-s4gjz\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.052712 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d44ca316-19b2-4a04-97c2-a77db9c711b6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.052721 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d44ca316-19b2-4a04-97c2-a77db9c711b6-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.162869 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.222767 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-whcvm"] Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.228830 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-whcvm"] Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.256825 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd29m\" (UniqueName: \"kubernetes.io/projected/b5e4033d-b980-406c-b771-080c120e6ee4-kube-api-access-wd29m\") pod \"b5e4033d-b980-406c-b771-080c120e6ee4\" (UID: \"b5e4033d-b980-406c-b771-080c120e6ee4\") " Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.256982 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b5e4033d-b980-406c-b771-080c120e6ee4-config\") pod \"b5e4033d-b980-406c-b771-080c120e6ee4\" (UID: \"b5e4033d-b980-406c-b771-080c120e6ee4\") " Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.257130 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5e4033d-b980-406c-b771-080c120e6ee4-dns-svc\") pod \"b5e4033d-b980-406c-b771-080c120e6ee4\" (UID: \"b5e4033d-b980-406c-b771-080c120e6ee4\") " Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.263947 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e4033d-b980-406c-b771-080c120e6ee4-kube-api-access-wd29m" (OuterVolumeSpecName: "kube-api-access-wd29m") pod "b5e4033d-b980-406c-b771-080c120e6ee4" (UID: "b5e4033d-b980-406c-b771-080c120e6ee4"). InnerVolumeSpecName "kube-api-access-wd29m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.298991 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e4033d-b980-406c-b771-080c120e6ee4-config" (OuterVolumeSpecName: "config") pod "b5e4033d-b980-406c-b771-080c120e6ee4" (UID: "b5e4033d-b980-406c-b771-080c120e6ee4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.301140 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e4033d-b980-406c-b771-080c120e6ee4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5e4033d-b980-406c-b771-080c120e6ee4" (UID: "b5e4033d-b980-406c-b771-080c120e6ee4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.308929 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9gjp"] Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.358348 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e4033d-b980-406c-b771-080c120e6ee4-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.358410 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5e4033d-b980-406c-b771-080c120e6ee4-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.358425 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd29m\" (UniqueName: \"kubernetes.io/projected/b5e4033d-b980-406c-b771-080c120e6ee4-kube-api-access-wd29m\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.524198 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6pfcf"] Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.529646 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zlflh"] Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.877774 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" event={"ID":"b5e4033d-b980-406c-b771-080c120e6ee4","Type":"ContainerDied","Data":"242c89baa5a7096dcf090b61f531a2f1c45ba6b14198fac6af3f4604607a5c32"} Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.878058 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-2f54p" Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.879434 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"122d837d-ee30-4e26-9e01-1f4bd8ebaace","Type":"ContainerStarted","Data":"3f0e4cc78570d38a27455b7f5fb82cb56419cee077a6bdabccb9293223ea80b6"} Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.921324 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2f54p"] Nov 27 11:25:20 crc kubenswrapper[4807]: I1127 11:25:20.926560 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-2f54p"] Nov 27 11:25:21 crc kubenswrapper[4807]: I1127 11:25:21.616158 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e4033d-b980-406c-b771-080c120e6ee4" path="/var/lib/kubelet/pods/b5e4033d-b980-406c-b771-080c120e6ee4/volumes" Nov 27 11:25:21 crc kubenswrapper[4807]: I1127 11:25:21.666491 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d44ca316-19b2-4a04-97c2-a77db9c711b6" path="/var/lib/kubelet/pods/d44ca316-19b2-4a04-97c2-a77db9c711b6/volumes" Nov 27 11:25:21 crc kubenswrapper[4807]: I1127 11:25:21.852283 4807 scope.go:117] "RemoveContainer" containerID="310ef764322c05b20f640d5bc2e4ea0e3e25f44a5ecb92df7212a44c0e372585" Nov 27 11:25:21 crc kubenswrapper[4807]: I1127 11:25:21.909946 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9gjp" event={"ID":"72480ac5-37a1-414b-b364-4290d9525ddb","Type":"ContainerStarted","Data":"3182558dd1f08f59e878f3f3358c90ac8eb03b9b08ccf0d45d593e641ef8fde9"} Nov 27 11:25:21 crc kubenswrapper[4807]: I1127 11:25:21.912497 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6pfcf" 
event={"ID":"186b6a8f-d303-440b-99ea-6502bac3e583","Type":"ContainerStarted","Data":"78339fbd295a705fe9900e49c2eb22be6bf82c7ec45f4d296aa2f3158f5d30b1"} Nov 27 11:25:21 crc kubenswrapper[4807]: I1127 11:25:21.914280 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" event={"ID":"6ecc155f-d98b-4a43-bf52-c58e0bf367f2","Type":"ContainerStarted","Data":"8519c067211758c60f53649f5111254dfe1aa6d6f35df5b68c807aed9b3a9932"} Nov 27 11:25:22 crc kubenswrapper[4807]: E1127 11:25:22.072620 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Nov 27 11:25:22 crc kubenswrapper[4807]: E1127 11:25:22.072677 4807 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Nov 27 11:25:22 crc kubenswrapper[4807]: E1127 11:25:22.072832 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pcr4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(8cfd9070-b1bd-4a24-b694-ae4ff059ec1c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 27 11:25:22 crc kubenswrapper[4807]: E1127 11:25:22.074386 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="8cfd9070-b1bd-4a24-b694-ae4ff059ec1c" Nov 27 11:25:22 crc kubenswrapper[4807]: I1127 11:25:22.923711 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"6603c2ee-9ab6-476c-8db6-d073f0dec3aa","Type":"ContainerStarted","Data":"91178cdf463d83db0c138432515a8a2a62e30b75071ee8dbf0267a4385aa68bd"} Nov 27 11:25:22 crc kubenswrapper[4807]: I1127 11:25:22.925001 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8ecce491-4a06-4922-8353-0586ac99471b","Type":"ContainerStarted","Data":"4c3bb6910f93e592bbc1c66032965d27b238add787a84b6e32489d458425f01d"} Nov 27 11:25:22 crc kubenswrapper[4807]: I1127 11:25:22.925137 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 27 11:25:22 crc kubenswrapper[4807]: E1127 11:25:22.926375 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="8cfd9070-b1bd-4a24-b694-ae4ff059ec1c" Nov 27 11:25:22 crc kubenswrapper[4807]: I1127 11:25:22.969534 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.228807825 podStartE2EDuration="36.969518947s" podCreationTimestamp="2025-11-27 11:24:46 +0000 UTC" firstStartedPulling="2025-11-27 11:24:58.562231275 +0000 UTC m=+939.661729483" lastFinishedPulling="2025-11-27 11:25:19.302942367 +0000 UTC m=+960.402440605" observedRunningTime="2025-11-27 11:25:22.967668708 +0000 UTC m=+964.067166906" watchObservedRunningTime="2025-11-27 11:25:22.969518947 +0000 UTC m=+964.069017145" Nov 27 11:25:23 crc kubenswrapper[4807]: I1127 11:25:23.278825 4807 scope.go:117] "RemoveContainer" containerID="a2b1a8e4702185dee3f46f98d2b08caa1786f54dd2c9c9cb29036d406991c9db" Nov 27 11:25:23 crc kubenswrapper[4807]: I1127 11:25:23.550481 4807 scope.go:117] "RemoveContainer" containerID="77b4f1ddc63c72b0deb7d2650e634467a0363051378d934a13c4b03db57aa0cf" Nov 27 11:25:23 crc kubenswrapper[4807]: 
I1127 11:25:23.592122 4807 scope.go:117] "RemoveContainer" containerID="9bcaf7a884dea0a3cb2589d64b0013a579cce8fa382cd962475d2d7394a07ee4" Nov 27 11:25:23 crc kubenswrapper[4807]: I1127 11:25:23.933551 4807 generic.go:334] "Generic (PLEG): container finished" podID="9bac140b-1bac-4d27-bb66-111e66af1edf" containerID="d28f3744d905e2baea996de34627dcb61662979270676f520f4558f63b9aec42" exitCode=0 Nov 27 11:25:23 crc kubenswrapper[4807]: I1127 11:25:23.933612 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-26rzj" event={"ID":"9bac140b-1bac-4d27-bb66-111e66af1edf","Type":"ContainerDied","Data":"d28f3744d905e2baea996de34627dcb61662979270676f520f4558f63b9aec42"} Nov 27 11:25:23 crc kubenswrapper[4807]: I1127 11:25:23.937779 4807 generic.go:334] "Generic (PLEG): container finished" podID="6ecc155f-d98b-4a43-bf52-c58e0bf367f2" containerID="923e5ab7be62b77e9612b6d6d9dbf707d54316654e93bbda8891b645f29c32b8" exitCode=0 Nov 27 11:25:23 crc kubenswrapper[4807]: I1127 11:25:23.937858 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" event={"ID":"6ecc155f-d98b-4a43-bf52-c58e0bf367f2","Type":"ContainerDied","Data":"923e5ab7be62b77e9612b6d6d9dbf707d54316654e93bbda8891b645f29c32b8"} Nov 27 11:25:23 crc kubenswrapper[4807]: I1127 11:25:23.943182 4807 generic.go:334] "Generic (PLEG): container finished" podID="72480ac5-37a1-414b-b364-4290d9525ddb" containerID="8f5660994310a44a62c5df25530684460af218339f36a9e897959d3e495db9d9" exitCode=0 Nov 27 11:25:23 crc kubenswrapper[4807]: I1127 11:25:23.943761 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9gjp" event={"ID":"72480ac5-37a1-414b-b364-4290d9525ddb","Type":"ContainerDied","Data":"8f5660994310a44a62c5df25530684460af218339f36a9e897959d3e495db9d9"} Nov 27 11:25:24 crc kubenswrapper[4807]: E1127 11:25:24.545032 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="0a8b97df-a50b-4cce-8035-28b23cbdaf72" Nov 27 11:25:24 crc kubenswrapper[4807]: I1127 11:25:24.951409 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-26rzj" event={"ID":"9bac140b-1bac-4d27-bb66-111e66af1edf","Type":"ContainerStarted","Data":"6133fb2ee1dbd6d51285473bf5f7fef44e8317d19a0e0df9a571966cfaa5894b"} Nov 27 11:25:24 crc kubenswrapper[4807]: I1127 11:25:24.953595 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" event={"ID":"6ecc155f-d98b-4a43-bf52-c58e0bf367f2","Type":"ContainerStarted","Data":"71baeb89310b77e48a443bf1334c6b0c7b0eb20fbce7702798ac00e81309927c"} Nov 27 11:25:24 crc kubenswrapper[4807]: I1127 11:25:24.953713 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:24 crc kubenswrapper[4807]: I1127 11:25:24.960591 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0a8b97df-a50b-4cce-8035-28b23cbdaf72","Type":"ContainerStarted","Data":"c9aa032e9fcf59bedc01fddfba6b689c78e938d8c45c380e9ee385bd3f686708"} Nov 27 11:25:24 crc kubenswrapper[4807]: E1127 11:25:24.964345 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="0a8b97df-a50b-4cce-8035-28b23cbdaf72" Nov 27 11:25:24 crc kubenswrapper[4807]: I1127 11:25:24.976952 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" podStartSLOduration=9.976933572 podStartE2EDuration="9.976933572s" podCreationTimestamp="2025-11-27 11:25:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:25:24.970325797 +0000 UTC m=+966.069824015" watchObservedRunningTime="2025-11-27 11:25:24.976933572 +0000 UTC m=+966.076431760" Nov 27 11:25:25 crc kubenswrapper[4807]: E1127 11:25:25.029921 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="e47ac50c-3a93-46fd-94f2-5c83e02e1919" Nov 27 11:25:25 crc kubenswrapper[4807]: I1127 11:25:25.987678 4807 generic.go:334] "Generic (PLEG): container finished" podID="72480ac5-37a1-414b-b364-4290d9525ddb" containerID="9e873195553453996426f105a171778d52c29988659cb9dc3526be8575f97ff7" exitCode=0 Nov 27 11:25:25 crc kubenswrapper[4807]: I1127 11:25:25.987758 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9gjp" event={"ID":"72480ac5-37a1-414b-b364-4290d9525ddb","Type":"ContainerDied","Data":"9e873195553453996426f105a171778d52c29988659cb9dc3526be8575f97ff7"} Nov 27 11:25:26 crc kubenswrapper[4807]: I1127 11:25:26.000923 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6pfcf" event={"ID":"186b6a8f-d303-440b-99ea-6502bac3e583","Type":"ContainerStarted","Data":"854b6aeb330b00e2fe0ebd59a752e0e0208a56d2e57e652b04cb69279d64bb46"} Nov 27 11:25:26 crc kubenswrapper[4807]: I1127 11:25:26.002983 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e47ac50c-3a93-46fd-94f2-5c83e02e1919","Type":"ContainerStarted","Data":"bc608110d9c2bed0317a7cfc5908ec8e6e205d5b763e3a80939b39ce7be7d9d8"} Nov 27 11:25:26 crc kubenswrapper[4807]: E1127 11:25:26.005448 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="e47ac50c-3a93-46fd-94f2-5c83e02e1919" Nov 27 11:25:26 crc kubenswrapper[4807]: I1127 11:25:26.008111 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-26rzj" event={"ID":"9bac140b-1bac-4d27-bb66-111e66af1edf","Type":"ContainerStarted","Data":"02dbc868bbea8ebd8e349cb392a0a646e321ea9f199e9ff9f1092e09a0d5ba3d"} Nov 27 11:25:26 crc kubenswrapper[4807]: I1127 11:25:26.008154 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:25:26 crc kubenswrapper[4807]: I1127 11:25:26.008362 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:25:26 crc kubenswrapper[4807]: E1127 11:25:26.010492 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="0a8b97df-a50b-4cce-8035-28b23cbdaf72" Nov 27 11:25:26 crc kubenswrapper[4807]: I1127 11:25:26.068881 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-26rzj" podStartSLOduration=15.921644357 podStartE2EDuration="34.068861164s" podCreationTimestamp="2025-11-27 11:24:52 +0000 UTC" firstStartedPulling="2025-11-27 11:25:01.733952094 +0000 UTC m=+942.833450312" lastFinishedPulling="2025-11-27 11:25:19.881168911 +0000 UTC m=+960.980667119" observedRunningTime="2025-11-27 11:25:26.058895421 +0000 UTC m=+967.158393619" watchObservedRunningTime="2025-11-27 11:25:26.068861164 +0000 UTC m=+967.168359362" Nov 27 11:25:27 crc kubenswrapper[4807]: E1127 11:25:27.017161 4807 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-sb-0" podUID="e47ac50c-3a93-46fd-94f2-5c83e02e1919" Nov 27 11:25:27 crc kubenswrapper[4807]: I1127 11:25:27.039326 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6pfcf" podStartSLOduration=9.106718946 podStartE2EDuration="12.039292678s" podCreationTimestamp="2025-11-27 11:25:15 +0000 UTC" firstStartedPulling="2025-11-27 11:25:21.86719405 +0000 UTC m=+962.966692248" lastFinishedPulling="2025-11-27 11:25:24.799767782 +0000 UTC m=+965.899265980" observedRunningTime="2025-11-27 11:25:26.099768821 +0000 UTC m=+967.199267039" watchObservedRunningTime="2025-11-27 11:25:27.039292678 +0000 UTC m=+968.138791136" Nov 27 11:25:30 crc kubenswrapper[4807]: I1127 11:25:30.971821 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:31 crc kubenswrapper[4807]: I1127 11:25:31.038432 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jm8h6"] Nov 27 11:25:31 crc kubenswrapper[4807]: I1127 11:25:31.038657 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" podUID="b9a3b2c2-7e2b-4d86-969a-3abba7d4773c" containerName="dnsmasq-dns" containerID="cri-o://3f4d05167779ed156fdf7b189cf31b61502f240a4cae0121613e05a0b9c2073a" gracePeriod=10 Nov 27 11:25:31 crc kubenswrapper[4807]: I1127 11:25:31.612414 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.051575 4807 generic.go:334] "Generic (PLEG): container finished" podID="b9a3b2c2-7e2b-4d86-969a-3abba7d4773c" 
containerID="3f4d05167779ed156fdf7b189cf31b61502f240a4cae0121613e05a0b9c2073a" exitCode=0 Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.051664 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" event={"ID":"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c","Type":"ContainerDied","Data":"3f4d05167779ed156fdf7b189cf31b61502f240a4cae0121613e05a0b9c2073a"} Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.053358 4807 generic.go:334] "Generic (PLEG): container finished" podID="b811158c-3b16-415b-95df-baba9483d782" containerID="9e6765c1329a0e3dbe2c7ebbe98178418e0634a6b096c810e4a057d37376b58b" exitCode=0 Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.053433 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b811158c-3b16-415b-95df-baba9483d782","Type":"ContainerDied","Data":"9e6765c1329a0e3dbe2c7ebbe98178418e0634a6b096c810e4a057d37376b58b"} Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.054798 4807 generic.go:334] "Generic (PLEG): container finished" podID="6603c2ee-9ab6-476c-8db6-d073f0dec3aa" containerID="91178cdf463d83db0c138432515a8a2a62e30b75071ee8dbf0267a4385aa68bd" exitCode=0 Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.054927 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6603c2ee-9ab6-476c-8db6-d073f0dec3aa","Type":"ContainerDied","Data":"91178cdf463d83db0c138432515a8a2a62e30b75071ee8dbf0267a4385aa68bd"} Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.058000 4807 generic.go:334] "Generic (PLEG): container finished" podID="e153e04c-cadb-4d8a-9863-9ef60eac08e9" containerID="83333aee30e5bc4ccaa87405f238b79bd7e64ebbe703fe9bddfe4658d8041fd5" exitCode=0 Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.058055 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"e153e04c-cadb-4d8a-9863-9ef60eac08e9","Type":"ContainerDied","Data":"83333aee30e5bc4ccaa87405f238b79bd7e64ebbe703fe9bddfe4658d8041fd5"} Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.061202 4807 generic.go:334] "Generic (PLEG): container finished" podID="122d837d-ee30-4e26-9e01-1f4bd8ebaace" containerID="3f0e4cc78570d38a27455b7f5fb82cb56419cee077a6bdabccb9293223ea80b6" exitCode=0 Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.061236 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"122d837d-ee30-4e26-9e01-1f4bd8ebaace","Type":"ContainerDied","Data":"3f0e4cc78570d38a27455b7f5fb82cb56419cee077a6bdabccb9293223ea80b6"} Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.199146 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" podUID="b9a3b2c2-7e2b-4d86-969a-3abba7d4773c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.95:5353: connect: connection refused" Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.623909 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.667725 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvjnk\" (UniqueName: \"kubernetes.io/projected/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-kube-api-access-dvjnk\") pod \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\" (UID: \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\") " Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.667818 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-config\") pod \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\" (UID: \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\") " Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.667973 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-dns-svc\") pod \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\" (UID: \"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c\") " Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.674283 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-kube-api-access-dvjnk" (OuterVolumeSpecName: "kube-api-access-dvjnk") pod "b9a3b2c2-7e2b-4d86-969a-3abba7d4773c" (UID: "b9a3b2c2-7e2b-4d86-969a-3abba7d4773c"). InnerVolumeSpecName "kube-api-access-dvjnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.713074 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b9a3b2c2-7e2b-4d86-969a-3abba7d4773c" (UID: "b9a3b2c2-7e2b-4d86-969a-3abba7d4773c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.721916 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-config" (OuterVolumeSpecName: "config") pod "b9a3b2c2-7e2b-4d86-969a-3abba7d4773c" (UID: "b9a3b2c2-7e2b-4d86-969a-3abba7d4773c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.769388 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.769427 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvjnk\" (UniqueName: \"kubernetes.io/projected/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-kube-api-access-dvjnk\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:32 crc kubenswrapper[4807]: I1127 11:25:32.769438 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.069613 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e153e04c-cadb-4d8a-9863-9ef60eac08e9","Type":"ContainerStarted","Data":"d32d2e57181385a869fe2e5229d795557297ae0cf042b50e58d543c67a8c113c"} Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.070741 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.072340 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wv2x5" 
event={"ID":"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4","Type":"ContainerStarted","Data":"e1606a2b9405a77c3b526853313b33e9b9e5e61dde8b9b65b924579b0abcf345"} Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.074955 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"122d837d-ee30-4e26-9e01-1f4bd8ebaace","Type":"ContainerStarted","Data":"23da0c064b308ee630756c2fcf020f4c80db4df6e2d4818c6d8179f582f8365d"} Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.076993 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9gjp" event={"ID":"72480ac5-37a1-414b-b364-4290d9525ddb","Type":"ContainerStarted","Data":"1c89d26302abeea6f5c01f3f59b2b283336e6db0bea1e8fce958bc01cecd9025"} Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.078358 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.078778 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-jm8h6" event={"ID":"b9a3b2c2-7e2b-4d86-969a-3abba7d4773c","Type":"ContainerDied","Data":"70daac6db7cb698d58d490aa69134fa37b4e76c4ac1c0115c33e04f2a3bb9782"} Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.078956 4807 scope.go:117] "RemoveContainer" containerID="3f4d05167779ed156fdf7b189cf31b61502f240a4cae0121613e05a0b9c2073a" Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.081686 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b811158c-3b16-415b-95df-baba9483d782","Type":"ContainerStarted","Data":"c48e2fc701d89281007affabfe5eea7db62794ac6386d9ad592957f4b6d3e212"} Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.082182 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 
11:25:33.087777 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6603c2ee-9ab6-476c-8db6-d073f0dec3aa","Type":"ContainerStarted","Data":"a9712719b1d0e6fa0850e2ba27ba9e9ce65d38fad7e42616919f609624864c20"} Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.097666 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.481011773 podStartE2EDuration="52.097653441s" podCreationTimestamp="2025-11-27 11:24:41 +0000 UTC" firstStartedPulling="2025-11-27 11:24:43.521049053 +0000 UTC m=+924.620547251" lastFinishedPulling="2025-11-27 11:24:58.137690721 +0000 UTC m=+939.237188919" observedRunningTime="2025-11-27 11:25:33.092759601 +0000 UTC m=+974.192257809" watchObservedRunningTime="2025-11-27 11:25:33.097653441 +0000 UTC m=+974.197151639" Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.098969 4807 scope.go:117] "RemoveContainer" containerID="d02eddd35d46dadeec18c4a0a36460fa3fd3f8ca50dc6de477b4904951b50a56" Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.121071 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.815470588 podStartE2EDuration="52.121054159s" podCreationTimestamp="2025-11-27 11:24:41 +0000 UTC" firstStartedPulling="2025-11-27 11:24:43.739641407 +0000 UTC m=+924.839139605" lastFinishedPulling="2025-11-27 11:24:58.045224978 +0000 UTC m=+939.144723176" observedRunningTime="2025-11-27 11:25:33.116893269 +0000 UTC m=+974.216391477" watchObservedRunningTime="2025-11-27 11:25:33.121054159 +0000 UTC m=+974.220552357" Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.142505 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=44.53051909 podStartE2EDuration="50.142489766s" podCreationTimestamp="2025-11-27 11:24:43 +0000 UTC" firstStartedPulling="2025-11-27 
11:24:58.308498473 +0000 UTC m=+939.407996671" lastFinishedPulling="2025-11-27 11:25:03.920469149 +0000 UTC m=+945.019967347" observedRunningTime="2025-11-27 11:25:33.141074569 +0000 UTC m=+974.240572767" watchObservedRunningTime="2025-11-27 11:25:33.142489766 +0000 UTC m=+974.241987964" Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.185762 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w9gjp" podStartSLOduration=20.754723183 podStartE2EDuration="29.18573933s" podCreationTimestamp="2025-11-27 11:25:04 +0000 UTC" firstStartedPulling="2025-11-27 11:25:23.973376264 +0000 UTC m=+965.072874472" lastFinishedPulling="2025-11-27 11:25:32.404392421 +0000 UTC m=+973.503890619" observedRunningTime="2025-11-27 11:25:33.183213023 +0000 UTC m=+974.282711251" watchObservedRunningTime="2025-11-27 11:25:33.18573933 +0000 UTC m=+974.285237528" Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.215684 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=41.542957351 podStartE2EDuration="49.215662821s" podCreationTimestamp="2025-11-27 11:24:44 +0000 UTC" firstStartedPulling="2025-11-27 11:24:58.53781159 +0000 UTC m=+939.637309788" lastFinishedPulling="2025-11-27 11:25:06.21051706 +0000 UTC m=+947.310015258" observedRunningTime="2025-11-27 11:25:33.207345671 +0000 UTC m=+974.306843879" watchObservedRunningTime="2025-11-27 11:25:33.215662821 +0000 UTC m=+974.315161019" Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.227578 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jm8h6"] Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.233423 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-jm8h6"] Nov 27 11:25:33 crc kubenswrapper[4807]: I1127 11:25:33.542186 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b9a3b2c2-7e2b-4d86-969a-3abba7d4773c" path="/var/lib/kubelet/pods/b9a3b2c2-7e2b-4d86-969a-3abba7d4773c/volumes" Nov 27 11:25:34 crc kubenswrapper[4807]: I1127 11:25:34.098354 4807 generic.go:334] "Generic (PLEG): container finished" podID="c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4" containerID="e1606a2b9405a77c3b526853313b33e9b9e5e61dde8b9b65b924579b0abcf345" exitCode=0 Nov 27 11:25:34 crc kubenswrapper[4807]: I1127 11:25:34.098434 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wv2x5" event={"ID":"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4","Type":"ContainerDied","Data":"e1606a2b9405a77c3b526853313b33e9b9e5e61dde8b9b65b924579b0abcf345"} Nov 27 11:25:34 crc kubenswrapper[4807]: I1127 11:25:34.596651 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:34 crc kubenswrapper[4807]: I1127 11:25:34.596721 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:34 crc kubenswrapper[4807]: I1127 11:25:34.644175 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:34 crc kubenswrapper[4807]: I1127 11:25:34.725289 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 27 11:25:34 crc kubenswrapper[4807]: I1127 11:25:34.725336 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 27 11:25:35 crc kubenswrapper[4807]: I1127 11:25:35.108679 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wv2x5" event={"ID":"c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4","Type":"ContainerStarted","Data":"ced0c9dbf6d62b1a3f6cb943b1566c87b44601b9d74fb7d2570227a00a191dc8"} Nov 27 11:25:35 crc kubenswrapper[4807]: I1127 11:25:35.112161 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8cfd9070-b1bd-4a24-b694-ae4ff059ec1c","Type":"ContainerStarted","Data":"0e65cdd448122001009f7cd9fc198fd59b27840371789a1c62bddb5984f32655"} Nov 27 11:25:35 crc kubenswrapper[4807]: I1127 11:25:35.112415 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 27 11:25:35 crc kubenswrapper[4807]: I1127 11:25:35.133070 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wv2x5" podStartSLOduration=14.152016048 podStartE2EDuration="47.133049714s" podCreationTimestamp="2025-11-27 11:24:48 +0000 UTC" firstStartedPulling="2025-11-27 11:25:01.725657815 +0000 UTC m=+942.825156013" lastFinishedPulling="2025-11-27 11:25:34.706691481 +0000 UTC m=+975.806189679" observedRunningTime="2025-11-27 11:25:35.128613927 +0000 UTC m=+976.228112115" watchObservedRunningTime="2025-11-27 11:25:35.133049714 +0000 UTC m=+976.232547912" Nov 27 11:25:35 crc kubenswrapper[4807]: I1127 11:25:35.160523 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.763661112 podStartE2EDuration="48.16050246s" podCreationTimestamp="2025-11-27 11:24:47 +0000 UTC" firstStartedPulling="2025-11-27 11:24:58.531204776 +0000 UTC m=+939.630702974" lastFinishedPulling="2025-11-27 11:25:34.928046124 +0000 UTC m=+976.027544322" observedRunningTime="2025-11-27 11:25:35.153178306 +0000 UTC m=+976.252676524" watchObservedRunningTime="2025-11-27 11:25:35.16050246 +0000 UTC m=+976.260000658" Nov 27 11:25:36 crc kubenswrapper[4807]: I1127 11:25:36.061946 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 27 11:25:36 crc kubenswrapper[4807]: I1127 11:25:36.062623 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/openstack-cell1-galera-0" Nov 27 11:25:36 crc kubenswrapper[4807]: I1127 11:25:36.121132 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-64nw4" event={"ID":"356f01bb-6304-499b-946d-1e9f3d6e7572","Type":"ContainerStarted","Data":"751b34395c1807ae5549b3b41c0556f701a516b4ce02c1bc41d2ff2f826a3c4b"} Nov 27 11:25:36 crc kubenswrapper[4807]: I1127 11:25:36.122514 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-64nw4" Nov 27 11:25:36 crc kubenswrapper[4807]: I1127 11:25:36.136937 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 27 11:25:36 crc kubenswrapper[4807]: I1127 11:25:36.138782 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-64nw4" podStartSLOduration=8.047218324 podStartE2EDuration="44.138764533s" podCreationTimestamp="2025-11-27 11:24:52 +0000 UTC" firstStartedPulling="2025-11-27 11:24:59.05039578 +0000 UTC m=+940.149893978" lastFinishedPulling="2025-11-27 11:25:35.141941989 +0000 UTC m=+976.241440187" observedRunningTime="2025-11-27 11:25:36.138372113 +0000 UTC m=+977.237870331" watchObservedRunningTime="2025-11-27 11:25:36.138764533 +0000 UTC m=+977.238262731" Nov 27 11:25:37 crc kubenswrapper[4807]: I1127 11:25:37.214349 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 27 11:25:37 crc kubenswrapper[4807]: I1127 11:25:37.355051 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 27 11:25:37 crc kubenswrapper[4807]: I1127 11:25:37.424850 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.427969 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-rtrnf"] Nov 27 
11:25:38 crc kubenswrapper[4807]: E1127 11:25:38.428654 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a3b2c2-7e2b-4d86-969a-3abba7d4773c" containerName="init" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.428670 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a3b2c2-7e2b-4d86-969a-3abba7d4773c" containerName="init" Nov 27 11:25:38 crc kubenswrapper[4807]: E1127 11:25:38.428693 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d44ca316-19b2-4a04-97c2-a77db9c711b6" containerName="extract-utilities" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.428701 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d44ca316-19b2-4a04-97c2-a77db9c711b6" containerName="extract-utilities" Nov 27 11:25:38 crc kubenswrapper[4807]: E1127 11:25:38.428714 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d44ca316-19b2-4a04-97c2-a77db9c711b6" containerName="extract-content" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.428722 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d44ca316-19b2-4a04-97c2-a77db9c711b6" containerName="extract-content" Nov 27 11:25:38 crc kubenswrapper[4807]: E1127 11:25:38.428736 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9a3b2c2-7e2b-4d86-969a-3abba7d4773c" containerName="dnsmasq-dns" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.428743 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9a3b2c2-7e2b-4d86-969a-3abba7d4773c" containerName="dnsmasq-dns" Nov 27 11:25:38 crc kubenswrapper[4807]: E1127 11:25:38.428754 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e4033d-b980-406c-b771-080c120e6ee4" containerName="init" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.428761 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e4033d-b980-406c-b771-080c120e6ee4" containerName="init" Nov 27 11:25:38 crc kubenswrapper[4807]: E1127 11:25:38.428772 4807 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d44ca316-19b2-4a04-97c2-a77db9c711b6" containerName="registry-server" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.428778 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d44ca316-19b2-4a04-97c2-a77db9c711b6" containerName="registry-server" Nov 27 11:25:38 crc kubenswrapper[4807]: E1127 11:25:38.428796 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e4033d-b980-406c-b771-080c120e6ee4" containerName="dnsmasq-dns" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.428802 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e4033d-b980-406c-b771-080c120e6ee4" containerName="dnsmasq-dns" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.428947 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e4033d-b980-406c-b771-080c120e6ee4" containerName="dnsmasq-dns" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.428968 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9a3b2c2-7e2b-4d86-969a-3abba7d4773c" containerName="dnsmasq-dns" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.428979 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d44ca316-19b2-4a04-97c2-a77db9c711b6" containerName="registry-server" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.429911 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.452323 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rtrnf"] Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.452990 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-dns-svc\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.453018 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6hcb\" (UniqueName: \"kubernetes.io/projected/fcecd1ef-65f5-47b8-8d41-55b3da46db65-kube-api-access-l6hcb\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.453089 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-config\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.453103 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.453170 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.554369 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-config\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.554410 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.554513 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.554537 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-dns-svc\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.554553 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6hcb\" (UniqueName: 
\"kubernetes.io/projected/fcecd1ef-65f5-47b8-8d41-55b3da46db65-kube-api-access-l6hcb\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.555973 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-config\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.556190 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.556546 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.557030 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-dns-svc\") pod \"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.589414 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6hcb\" (UniqueName: \"kubernetes.io/projected/fcecd1ef-65f5-47b8-8d41-55b3da46db65-kube-api-access-l6hcb\") pod 
\"dnsmasq-dns-698758b865-rtrnf\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:38 crc kubenswrapper[4807]: I1127 11:25:38.745879 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.146963 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"0a8b97df-a50b-4cce-8035-28b23cbdaf72","Type":"ContainerStarted","Data":"4ad0da4ad7edad966cb825f9db743fb2e175d6e957c35fd57a4d90712f80f6fb"} Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.149911 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.150495 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.153722 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"e47ac50c-3a93-46fd-94f2-5c83e02e1919","Type":"ContainerStarted","Data":"e051685c7357041f568447a184e7b86f7e16d88aed70d445992636528ce010ae"} Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.188516 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.107752156 podStartE2EDuration="45.188496324s" podCreationTimestamp="2025-11-27 11:24:54 +0000 UTC" firstStartedPulling="2025-11-27 11:24:58.967175581 +0000 UTC m=+940.066673779" lastFinishedPulling="2025-11-27 11:25:38.047919749 +0000 UTC m=+979.147417947" observedRunningTime="2025-11-27 11:25:39.175621644 +0000 UTC m=+980.275119842" watchObservedRunningTime="2025-11-27 11:25:39.188496324 +0000 UTC m=+980.287994522" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.193731 4807 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rtrnf"] Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.194353 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:25:39 crc kubenswrapper[4807]: W1127 11:25:39.195824 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcecd1ef_65f5_47b8_8d41_55b3da46db65.slice/crio-6a9616cca75debbe4574a723eab851d9f616620975788cdebc4b96bbd79c6321 WatchSource:0}: Error finding container 6a9616cca75debbe4574a723eab851d9f616620975788cdebc4b96bbd79c6321: Status 404 returned error can't find the container with id 6a9616cca75debbe4574a723eab851d9f616620975788cdebc4b96bbd79c6321 Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.204691 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.300444456 podStartE2EDuration="47.204669812s" podCreationTimestamp="2025-11-27 11:24:52 +0000 UTC" firstStartedPulling="2025-11-27 11:24:59.141851115 +0000 UTC m=+940.241349313" lastFinishedPulling="2025-11-27 11:25:38.046076471 +0000 UTC m=+979.145574669" observedRunningTime="2025-11-27 11:25:39.197841661 +0000 UTC m=+980.297339869" watchObservedRunningTime="2025-11-27 11:25:39.204669812 +0000 UTC m=+980.304168010" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.571905 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.579864 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.583491 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.583663 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.584404 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.586202 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-c48bj" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.604870 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.769865 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.769927 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-lock\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.769947 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7fbd\" (UniqueName: \"kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-kube-api-access-n7fbd\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " 
pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.769964 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-cache\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.769994 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.871340 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.871431 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-lock\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.871461 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7fbd\" (UniqueName: \"kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-kube-api-access-n7fbd\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.871488 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-cache\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.871522 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.871828 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.872691 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-lock\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.873290 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-cache\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: E1127 11:25:39.873399 4807 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 11:25:39 crc kubenswrapper[4807]: E1127 11:25:39.873419 4807 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 11:25:39 crc kubenswrapper[4807]: 
E1127 11:25:39.873545 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift podName:bc29fb6b-2886-4d51-8afd-be8fc1069ee4 nodeName:}" failed. No retries permitted until 2025-11-27 11:25:40.373526436 +0000 UTC m=+981.473024634 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift") pod "swift-storage-0" (UID: "bc29fb6b-2886-4d51-8afd-be8fc1069ee4") : configmap "swift-ring-files" not found Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.893610 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7fbd\" (UniqueName: \"kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-kube-api-access-n7fbd\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:39 crc kubenswrapper[4807]: I1127 11:25:39.894587 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.082030 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-6hg29"] Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.082956 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.084999 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.085014 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.085239 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.105038 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6hg29"] Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.168209 4807 generic.go:334] "Generic (PLEG): container finished" podID="fcecd1ef-65f5-47b8-8d41-55b3da46db65" containerID="03c8dac7e13a41a25540dcc2719f59c8357ca6d74e4872a6390a02c3327452a8" exitCode=0 Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.169582 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rtrnf" event={"ID":"fcecd1ef-65f5-47b8-8d41-55b3da46db65","Type":"ContainerDied","Data":"03c8dac7e13a41a25540dcc2719f59c8357ca6d74e4872a6390a02c3327452a8"} Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.169626 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rtrnf" event={"ID":"fcecd1ef-65f5-47b8-8d41-55b3da46db65","Type":"ContainerStarted","Data":"6a9616cca75debbe4574a723eab851d9f616620975788cdebc4b96bbd79c6321"} Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.221297 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wv2x5" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.277911 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/b83dff2f-801e-4a9b-9427-48e1f51bcc79-etc-swift\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.278071 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b83dff2f-801e-4a9b-9427-48e1f51bcc79-ring-data-devices\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.278110 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-448ng\" (UniqueName: \"kubernetes.io/projected/b83dff2f-801e-4a9b-9427-48e1f51bcc79-kube-api-access-448ng\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.278200 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-combined-ca-bundle\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.278261 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-swiftconf\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.278322 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/b83dff2f-801e-4a9b-9427-48e1f51bcc79-scripts\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.278346 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-dispersionconf\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.313971 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wv2x5"] Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.379532 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b83dff2f-801e-4a9b-9427-48e1f51bcc79-etc-swift\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.379939 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b83dff2f-801e-4a9b-9427-48e1f51bcc79-ring-data-devices\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.379972 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-448ng\" (UniqueName: \"kubernetes.io/projected/b83dff2f-801e-4a9b-9427-48e1f51bcc79-kube-api-access-448ng\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc 
kubenswrapper[4807]: I1127 11:25:40.380005 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.380036 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-combined-ca-bundle\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.380036 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j2bls"] Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.380290 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j2bls" podUID="caec2dd0-c63a-4572-9511-5e5b3be487fb" containerName="registry-server" containerID="cri-o://a4daf4586a9266b59b1f6c3228d4a33de6a93ff1da981eb0ffba75869bd7c662" gracePeriod=2 Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.380057 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-swiftconf\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.380494 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b83dff2f-801e-4a9b-9427-48e1f51bcc79-scripts\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " 
pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.380530 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-dispersionconf\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: E1127 11:25:40.380828 4807 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 11:25:40 crc kubenswrapper[4807]: E1127 11:25:40.380851 4807 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 11:25:40 crc kubenswrapper[4807]: E1127 11:25:40.380935 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift podName:bc29fb6b-2886-4d51-8afd-be8fc1069ee4 nodeName:}" failed. No retries permitted until 2025-11-27 11:25:41.380917001 +0000 UTC m=+982.480415269 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift") pod "swift-storage-0" (UID: "bc29fb6b-2886-4d51-8afd-be8fc1069ee4") : configmap "swift-ring-files" not found Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.380939 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b83dff2f-801e-4a9b-9427-48e1f51bcc79-etc-swift\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.381383 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b83dff2f-801e-4a9b-9427-48e1f51bcc79-ring-data-devices\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.382071 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b83dff2f-801e-4a9b-9427-48e1f51bcc79-scripts\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.385728 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-combined-ca-bundle\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.386512 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-swiftconf\") pod 
\"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.393298 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-dispersionconf\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.401928 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-448ng\" (UniqueName: \"kubernetes.io/projected/b83dff2f-801e-4a9b-9427-48e1f51bcc79-kube-api-access-448ng\") pod \"swift-ring-rebalance-6hg29\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.421816 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.807063 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.807378 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.895332 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6hg29"] Nov 27 11:25:40 crc kubenswrapper[4807]: W1127 11:25:40.903093 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb83dff2f_801e_4a9b_9427_48e1f51bcc79.slice/crio-0ed382d648f2dfe7500fa89fddca6caeb0bfba3032ff9f176a991ff5ee1e3654 WatchSource:0}: Error finding container 0ed382d648f2dfe7500fa89fddca6caeb0bfba3032ff9f176a991ff5ee1e3654: Status 404 returned error can't find the container with id 0ed382d648f2dfe7500fa89fddca6caeb0bfba3032ff9f176a991ff5ee1e3654 Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.909568 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.993458 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caec2dd0-c63a-4572-9511-5e5b3be487fb-utilities\") pod \"caec2dd0-c63a-4572-9511-5e5b3be487fb\" (UID: \"caec2dd0-c63a-4572-9511-5e5b3be487fb\") " Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.993693 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caec2dd0-c63a-4572-9511-5e5b3be487fb-catalog-content\") pod \"caec2dd0-c63a-4572-9511-5e5b3be487fb\" (UID: \"caec2dd0-c63a-4572-9511-5e5b3be487fb\") " Nov 27 11:25:40 crc kubenswrapper[4807]: I1127 11:25:40.993755 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldllc\" (UniqueName: \"kubernetes.io/projected/caec2dd0-c63a-4572-9511-5e5b3be487fb-kube-api-access-ldllc\") pod \"caec2dd0-c63a-4572-9511-5e5b3be487fb\" (UID: \"caec2dd0-c63a-4572-9511-5e5b3be487fb\") " Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.003389 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caec2dd0-c63a-4572-9511-5e5b3be487fb-utilities" (OuterVolumeSpecName: "utilities") pod "caec2dd0-c63a-4572-9511-5e5b3be487fb" (UID: "caec2dd0-c63a-4572-9511-5e5b3be487fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.004808 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caec2dd0-c63a-4572-9511-5e5b3be487fb-kube-api-access-ldllc" (OuterVolumeSpecName: "kube-api-access-ldllc") pod "caec2dd0-c63a-4572-9511-5e5b3be487fb" (UID: "caec2dd0-c63a-4572-9511-5e5b3be487fb"). InnerVolumeSpecName "kube-api-access-ldllc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.059268 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caec2dd0-c63a-4572-9511-5e5b3be487fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "caec2dd0-c63a-4572-9511-5e5b3be487fb" (UID: "caec2dd0-c63a-4572-9511-5e5b3be487fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.095478 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caec2dd0-c63a-4572-9511-5e5b3be487fb-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.095819 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caec2dd0-c63a-4572-9511-5e5b3be487fb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.095837 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldllc\" (UniqueName: \"kubernetes.io/projected/caec2dd0-c63a-4572-9511-5e5b3be487fb-kube-api-access-ldllc\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.177536 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rtrnf" event={"ID":"fcecd1ef-65f5-47b8-8d41-55b3da46db65","Type":"ContainerStarted","Data":"7677a534b698f567bf3cf669b38eca8fd4eee4df46f5f7240e5db151c875f6af"} Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.177720 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.180106 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6hg29" 
event={"ID":"b83dff2f-801e-4a9b-9427-48e1f51bcc79","Type":"ContainerStarted","Data":"0ed382d648f2dfe7500fa89fddca6caeb0bfba3032ff9f176a991ff5ee1e3654"} Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.182668 4807 generic.go:334] "Generic (PLEG): container finished" podID="caec2dd0-c63a-4572-9511-5e5b3be487fb" containerID="a4daf4586a9266b59b1f6c3228d4a33de6a93ff1da981eb0ffba75869bd7c662" exitCode=0 Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.182735 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j2bls" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.182783 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2bls" event={"ID":"caec2dd0-c63a-4572-9511-5e5b3be487fb","Type":"ContainerDied","Data":"a4daf4586a9266b59b1f6c3228d4a33de6a93ff1da981eb0ffba75869bd7c662"} Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.182845 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j2bls" event={"ID":"caec2dd0-c63a-4572-9511-5e5b3be487fb","Type":"ContainerDied","Data":"63eab6f0670377da29f765674e9ce16434a23d908068217cf62dd36b36eadebe"} Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.182871 4807 scope.go:117] "RemoveContainer" containerID="a4daf4586a9266b59b1f6c3228d4a33de6a93ff1da981eb0ffba75869bd7c662" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.203648 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-rtrnf" podStartSLOduration=3.203631373 podStartE2EDuration="3.203631373s" podCreationTimestamp="2025-11-27 11:25:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:25:41.195633311 +0000 UTC m=+982.295131519" watchObservedRunningTime="2025-11-27 11:25:41.203631373 +0000 UTC m=+982.303129571" Nov 
27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.211744 4807 scope.go:117] "RemoveContainer" containerID="bdafadc82c84f6d22aebbb736839ff3bcadad2a3d7efae12aa7f84d5f179440e" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.219808 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j2bls"] Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.228983 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j2bls"] Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.233877 4807 scope.go:117] "RemoveContainer" containerID="2df0a9a38cbff88bfa5b0c178cc8ba91c3253cc36bf0ae1e1b9fc2d44419247a" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.264436 4807 scope.go:117] "RemoveContainer" containerID="a4daf4586a9266b59b1f6c3228d4a33de6a93ff1da981eb0ffba75869bd7c662" Nov 27 11:25:41 crc kubenswrapper[4807]: E1127 11:25:41.266627 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4daf4586a9266b59b1f6c3228d4a33de6a93ff1da981eb0ffba75869bd7c662\": container with ID starting with a4daf4586a9266b59b1f6c3228d4a33de6a93ff1da981eb0ffba75869bd7c662 not found: ID does not exist" containerID="a4daf4586a9266b59b1f6c3228d4a33de6a93ff1da981eb0ffba75869bd7c662" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.266667 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4daf4586a9266b59b1f6c3228d4a33de6a93ff1da981eb0ffba75869bd7c662"} err="failed to get container status \"a4daf4586a9266b59b1f6c3228d4a33de6a93ff1da981eb0ffba75869bd7c662\": rpc error: code = NotFound desc = could not find container \"a4daf4586a9266b59b1f6c3228d4a33de6a93ff1da981eb0ffba75869bd7c662\": container with ID starting with a4daf4586a9266b59b1f6c3228d4a33de6a93ff1da981eb0ffba75869bd7c662 not found: ID does not exist" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.266693 
4807 scope.go:117] "RemoveContainer" containerID="bdafadc82c84f6d22aebbb736839ff3bcadad2a3d7efae12aa7f84d5f179440e" Nov 27 11:25:41 crc kubenswrapper[4807]: E1127 11:25:41.267067 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdafadc82c84f6d22aebbb736839ff3bcadad2a3d7efae12aa7f84d5f179440e\": container with ID starting with bdafadc82c84f6d22aebbb736839ff3bcadad2a3d7efae12aa7f84d5f179440e not found: ID does not exist" containerID="bdafadc82c84f6d22aebbb736839ff3bcadad2a3d7efae12aa7f84d5f179440e" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.267101 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdafadc82c84f6d22aebbb736839ff3bcadad2a3d7efae12aa7f84d5f179440e"} err="failed to get container status \"bdafadc82c84f6d22aebbb736839ff3bcadad2a3d7efae12aa7f84d5f179440e\": rpc error: code = NotFound desc = could not find container \"bdafadc82c84f6d22aebbb736839ff3bcadad2a3d7efae12aa7f84d5f179440e\": container with ID starting with bdafadc82c84f6d22aebbb736839ff3bcadad2a3d7efae12aa7f84d5f179440e not found: ID does not exist" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.267120 4807 scope.go:117] "RemoveContainer" containerID="2df0a9a38cbff88bfa5b0c178cc8ba91c3253cc36bf0ae1e1b9fc2d44419247a" Nov 27 11:25:41 crc kubenswrapper[4807]: E1127 11:25:41.267420 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df0a9a38cbff88bfa5b0c178cc8ba91c3253cc36bf0ae1e1b9fc2d44419247a\": container with ID starting with 2df0a9a38cbff88bfa5b0c178cc8ba91c3253cc36bf0ae1e1b9fc2d44419247a not found: ID does not exist" containerID="2df0a9a38cbff88bfa5b0c178cc8ba91c3253cc36bf0ae1e1b9fc2d44419247a" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.267471 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2df0a9a38cbff88bfa5b0c178cc8ba91c3253cc36bf0ae1e1b9fc2d44419247a"} err="failed to get container status \"2df0a9a38cbff88bfa5b0c178cc8ba91c3253cc36bf0ae1e1b9fc2d44419247a\": rpc error: code = NotFound desc = could not find container \"2df0a9a38cbff88bfa5b0c178cc8ba91c3253cc36bf0ae1e1b9fc2d44419247a\": container with ID starting with 2df0a9a38cbff88bfa5b0c178cc8ba91c3253cc36bf0ae1e1b9fc2d44419247a not found: ID does not exist" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.375992 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.401022 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:41 crc kubenswrapper[4807]: E1127 11:25:41.401235 4807 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 11:25:41 crc kubenswrapper[4807]: E1127 11:25:41.401271 4807 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 11:25:41 crc kubenswrapper[4807]: E1127 11:25:41.401321 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift podName:bc29fb6b-2886-4d51-8afd-be8fc1069ee4 nodeName:}" failed. No retries permitted until 2025-11-27 11:25:43.401306469 +0000 UTC m=+984.500804667 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift") pod "swift-storage-0" (UID: "bc29fb6b-2886-4d51-8afd-be8fc1069ee4") : configmap "swift-ring-files" not found Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.431318 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.542188 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caec2dd0-c63a-4572-9511-5e5b3be487fb" path="/var/lib/kubelet/pods/caec2dd0-c63a-4572-9511-5e5b3be487fb/volumes" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.861231 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-9fmgz"] Nov 27 11:25:41 crc kubenswrapper[4807]: E1127 11:25:41.861668 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caec2dd0-c63a-4572-9511-5e5b3be487fb" containerName="extract-utilities" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.861684 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="caec2dd0-c63a-4572-9511-5e5b3be487fb" containerName="extract-utilities" Nov 27 11:25:41 crc kubenswrapper[4807]: E1127 11:25:41.861699 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caec2dd0-c63a-4572-9511-5e5b3be487fb" containerName="extract-content" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.861706 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="caec2dd0-c63a-4572-9511-5e5b3be487fb" containerName="extract-content" Nov 27 11:25:41 crc kubenswrapper[4807]: E1127 11:25:41.861739 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caec2dd0-c63a-4572-9511-5e5b3be487fb" containerName="registry-server" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.861746 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="caec2dd0-c63a-4572-9511-5e5b3be487fb" 
containerName="registry-server" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.861911 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="caec2dd0-c63a-4572-9511-5e5b3be487fb" containerName="registry-server" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.862611 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9fmgz" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.874189 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-79e1-account-create-update-cjvzt"] Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.875368 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-79e1-account-create-update-cjvzt" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.878132 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.903685 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9fmgz"] Nov 27 11:25:41 crc kubenswrapper[4807]: I1127 11:25:41.926815 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-79e1-account-create-update-cjvzt"] Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.042267 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ca4c18-464b-4f06-8dee-dae2b044e9a5-operator-scripts\") pod \"glance-79e1-account-create-update-cjvzt\" (UID: \"c0ca4c18-464b-4f06-8dee-dae2b044e9a5\") " pod="openstack/glance-79e1-account-create-update-cjvzt" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.042327 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d413ebfd-893d-437a-b034-72425fa40d8a-operator-scripts\") pod 
\"glance-db-create-9fmgz\" (UID: \"d413ebfd-893d-437a-b034-72425fa40d8a\") " pod="openstack/glance-db-create-9fmgz" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.042364 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-256bb\" (UniqueName: \"kubernetes.io/projected/c0ca4c18-464b-4f06-8dee-dae2b044e9a5-kube-api-access-256bb\") pod \"glance-79e1-account-create-update-cjvzt\" (UID: \"c0ca4c18-464b-4f06-8dee-dae2b044e9a5\") " pod="openstack/glance-79e1-account-create-update-cjvzt" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.042423 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv9gb\" (UniqueName: \"kubernetes.io/projected/d413ebfd-893d-437a-b034-72425fa40d8a-kube-api-access-qv9gb\") pod \"glance-db-create-9fmgz\" (UID: \"d413ebfd-893d-437a-b034-72425fa40d8a\") " pod="openstack/glance-db-create-9fmgz" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.143941 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ca4c18-464b-4f06-8dee-dae2b044e9a5-operator-scripts\") pod \"glance-79e1-account-create-update-cjvzt\" (UID: \"c0ca4c18-464b-4f06-8dee-dae2b044e9a5\") " pod="openstack/glance-79e1-account-create-update-cjvzt" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.144322 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d413ebfd-893d-437a-b034-72425fa40d8a-operator-scripts\") pod \"glance-db-create-9fmgz\" (UID: \"d413ebfd-893d-437a-b034-72425fa40d8a\") " pod="openstack/glance-db-create-9fmgz" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.144362 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-256bb\" (UniqueName: 
\"kubernetes.io/projected/c0ca4c18-464b-4f06-8dee-dae2b044e9a5-kube-api-access-256bb\") pod \"glance-79e1-account-create-update-cjvzt\" (UID: \"c0ca4c18-464b-4f06-8dee-dae2b044e9a5\") " pod="openstack/glance-79e1-account-create-update-cjvzt" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.144405 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv9gb\" (UniqueName: \"kubernetes.io/projected/d413ebfd-893d-437a-b034-72425fa40d8a-kube-api-access-qv9gb\") pod \"glance-db-create-9fmgz\" (UID: \"d413ebfd-893d-437a-b034-72425fa40d8a\") " pod="openstack/glance-db-create-9fmgz" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.145569 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ca4c18-464b-4f06-8dee-dae2b044e9a5-operator-scripts\") pod \"glance-79e1-account-create-update-cjvzt\" (UID: \"c0ca4c18-464b-4f06-8dee-dae2b044e9a5\") " pod="openstack/glance-79e1-account-create-update-cjvzt" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.146172 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d413ebfd-893d-437a-b034-72425fa40d8a-operator-scripts\") pod \"glance-db-create-9fmgz\" (UID: \"d413ebfd-893d-437a-b034-72425fa40d8a\") " pod="openstack/glance-db-create-9fmgz" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.167642 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv9gb\" (UniqueName: \"kubernetes.io/projected/d413ebfd-893d-437a-b034-72425fa40d8a-kube-api-access-qv9gb\") pod \"glance-db-create-9fmgz\" (UID: \"d413ebfd-893d-437a-b034-72425fa40d8a\") " pod="openstack/glance-db-create-9fmgz" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.173941 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-256bb\" (UniqueName: 
\"kubernetes.io/projected/c0ca4c18-464b-4f06-8dee-dae2b044e9a5-kube-api-access-256bb\") pod \"glance-79e1-account-create-update-cjvzt\" (UID: \"c0ca4c18-464b-4f06-8dee-dae2b044e9a5\") " pod="openstack/glance-79e1-account-create-update-cjvzt" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.177437 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9fmgz" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.191699 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-79e1-account-create-update-cjvzt" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.205110 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.641400 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9fmgz"] Nov 27 11:25:42 crc kubenswrapper[4807]: I1127 11:25:42.733224 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-79e1-account-create-update-cjvzt"] Nov 27 11:25:42 crc kubenswrapper[4807]: W1127 11:25:42.751417 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0ca4c18_464b_4f06_8dee_dae2b044e9a5.slice/crio-67cbaafe199df20d5c0e55a99e436953b1ae2397c023a26efce1caee57809a13 WatchSource:0}: Error finding container 67cbaafe199df20d5c0e55a99e436953b1ae2397c023a26efce1caee57809a13: Status 404 returned error can't find the container with id 67cbaafe199df20d5c0e55a99e436953b1ae2397c023a26efce1caee57809a13 Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.066466 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.216884 4807 generic.go:334] "Generic (PLEG): container finished" podID="d413ebfd-893d-437a-b034-72425fa40d8a" 
containerID="c3cffb3485da8458f1f54cd6ab7161b732dfd9fa27f2432b58e43018cc6e472e" exitCode=0 Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.216955 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9fmgz" event={"ID":"d413ebfd-893d-437a-b034-72425fa40d8a","Type":"ContainerDied","Data":"c3cffb3485da8458f1f54cd6ab7161b732dfd9fa27f2432b58e43018cc6e472e"} Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.216992 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9fmgz" event={"ID":"d413ebfd-893d-437a-b034-72425fa40d8a","Type":"ContainerStarted","Data":"e44a1bce2724b88bb906255ac3829f501015643c84058577682f0f919d4ca767"} Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.218685 4807 generic.go:334] "Generic (PLEG): container finished" podID="c0ca4c18-464b-4f06-8dee-dae2b044e9a5" containerID="f7d38e6347b03d6d3d12be71b78496063cdde1a25f864f1c82c2046c72edccca" exitCode=0 Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.219565 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-79e1-account-create-update-cjvzt" event={"ID":"c0ca4c18-464b-4f06-8dee-dae2b044e9a5","Type":"ContainerDied","Data":"f7d38e6347b03d6d3d12be71b78496063cdde1a25f864f1c82c2046c72edccca"} Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.219590 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-79e1-account-create-update-cjvzt" event={"ID":"c0ca4c18-464b-4f06-8dee-dae2b044e9a5","Type":"ContainerStarted","Data":"67cbaafe199df20d5c0e55a99e436953b1ae2397c023a26efce1caee57809a13"} Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.271298 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.358206 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-4qq8v"] Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.359909 
4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4qq8v" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.365800 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4qq8v"] Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.456950 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-j5bh8"] Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.459438 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j5bh8" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.473163 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-j5bh8"] Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.473495 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-8abf-account-create-update-kvbpv"] Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.474690 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.474795 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q25jq\" (UniqueName: \"kubernetes.io/projected/8ab6613c-de00-44b8-9a65-cff272bb76e8-kube-api-access-q25jq\") pod \"cinder-db-create-4qq8v\" (UID: \"8ab6613c-de00-44b8-9a65-cff272bb76e8\") " pod="openstack/cinder-db-create-4qq8v" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.474843 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ab6613c-de00-44b8-9a65-cff272bb76e8-operator-scripts\") pod 
\"cinder-db-create-4qq8v\" (UID: \"8ab6613c-de00-44b8-9a65-cff272bb76e8\") " pod="openstack/cinder-db-create-4qq8v" Nov 27 11:25:43 crc kubenswrapper[4807]: E1127 11:25:43.474948 4807 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 11:25:43 crc kubenswrapper[4807]: E1127 11:25:43.474991 4807 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 11:25:43 crc kubenswrapper[4807]: E1127 11:25:43.475045 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift podName:bc29fb6b-2886-4d51-8afd-be8fc1069ee4 nodeName:}" failed. No retries permitted until 2025-11-27 11:25:47.475028325 +0000 UTC m=+988.574526513 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift") pod "swift-storage-0" (UID: "bc29fb6b-2886-4d51-8afd-be8fc1069ee4") : configmap "swift-ring-files" not found Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.476360 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8abf-account-create-update-kvbpv" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.480964 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.486318 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.491587 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8abf-account-create-update-kvbpv"] Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.577503 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df1ead0-bf09-4f2f-af65-52c27e8e750e-operator-scripts\") pod \"barbican-db-create-j5bh8\" (UID: \"1df1ead0-bf09-4f2f-af65-52c27e8e750e\") " pod="openstack/barbican-db-create-j5bh8" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.577572 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvkjh\" (UniqueName: \"kubernetes.io/projected/1df1ead0-bf09-4f2f-af65-52c27e8e750e-kube-api-access-zvkjh\") pod \"barbican-db-create-j5bh8\" (UID: \"1df1ead0-bf09-4f2f-af65-52c27e8e750e\") " pod="openstack/barbican-db-create-j5bh8" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.577662 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7-operator-scripts\") pod \"barbican-8abf-account-create-update-kvbpv\" (UID: \"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7\") " pod="openstack/barbican-8abf-account-create-update-kvbpv" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.577718 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q25jq\" (UniqueName: \"kubernetes.io/projected/8ab6613c-de00-44b8-9a65-cff272bb76e8-kube-api-access-q25jq\") pod \"cinder-db-create-4qq8v\" (UID: \"8ab6613c-de00-44b8-9a65-cff272bb76e8\") " pod="openstack/cinder-db-create-4qq8v" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.577738 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scfw2\" (UniqueName: \"kubernetes.io/projected/0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7-kube-api-access-scfw2\") pod \"barbican-8abf-account-create-update-kvbpv\" (UID: \"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7\") " pod="openstack/barbican-8abf-account-create-update-kvbpv" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.577782 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ab6613c-de00-44b8-9a65-cff272bb76e8-operator-scripts\") pod \"cinder-db-create-4qq8v\" (UID: \"8ab6613c-de00-44b8-9a65-cff272bb76e8\") " pod="openstack/cinder-db-create-4qq8v" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.579054 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3b55-account-create-update-q96ln"] Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.579957 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ab6613c-de00-44b8-9a65-cff272bb76e8-operator-scripts\") pod \"cinder-db-create-4qq8v\" (UID: \"8ab6613c-de00-44b8-9a65-cff272bb76e8\") " pod="openstack/cinder-db-create-4qq8v" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.580053 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3b55-account-create-update-q96ln" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.587239 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.595463 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3b55-account-create-update-q96ln"] Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.617095 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q25jq\" (UniqueName: \"kubernetes.io/projected/8ab6613c-de00-44b8-9a65-cff272bb76e8-kube-api-access-q25jq\") pod \"cinder-db-create-4qq8v\" (UID: \"8ab6613c-de00-44b8-9a65-cff272bb76e8\") " pod="openstack/cinder-db-create-4qq8v" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.678737 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scfw2\" (UniqueName: \"kubernetes.io/projected/0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7-kube-api-access-scfw2\") pod \"barbican-8abf-account-create-update-kvbpv\" (UID: \"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7\") " pod="openstack/barbican-8abf-account-create-update-kvbpv" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.678794 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86-operator-scripts\") pod \"cinder-3b55-account-create-update-q96ln\" (UID: \"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86\") " pod="openstack/cinder-3b55-account-create-update-q96ln" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.678836 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vl82\" (UniqueName: \"kubernetes.io/projected/cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86-kube-api-access-9vl82\") pod 
\"cinder-3b55-account-create-update-q96ln\" (UID: \"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86\") " pod="openstack/cinder-3b55-account-create-update-q96ln" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.678893 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df1ead0-bf09-4f2f-af65-52c27e8e750e-operator-scripts\") pod \"barbican-db-create-j5bh8\" (UID: \"1df1ead0-bf09-4f2f-af65-52c27e8e750e\") " pod="openstack/barbican-db-create-j5bh8" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.678936 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvkjh\" (UniqueName: \"kubernetes.io/projected/1df1ead0-bf09-4f2f-af65-52c27e8e750e-kube-api-access-zvkjh\") pod \"barbican-db-create-j5bh8\" (UID: \"1df1ead0-bf09-4f2f-af65-52c27e8e750e\") " pod="openstack/barbican-db-create-j5bh8" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.679007 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7-operator-scripts\") pod \"barbican-8abf-account-create-update-kvbpv\" (UID: \"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7\") " pod="openstack/barbican-8abf-account-create-update-kvbpv" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.680124 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df1ead0-bf09-4f2f-af65-52c27e8e750e-operator-scripts\") pod \"barbican-db-create-j5bh8\" (UID: \"1df1ead0-bf09-4f2f-af65-52c27e8e750e\") " pod="openstack/barbican-db-create-j5bh8" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.680499 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7-operator-scripts\") pod 
\"barbican-8abf-account-create-update-kvbpv\" (UID: \"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7\") " pod="openstack/barbican-8abf-account-create-update-kvbpv" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.682409 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4qq8v" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.695872 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scfw2\" (UniqueName: \"kubernetes.io/projected/0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7-kube-api-access-scfw2\") pod \"barbican-8abf-account-create-update-kvbpv\" (UID: \"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7\") " pod="openstack/barbican-8abf-account-create-update-kvbpv" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.696774 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvkjh\" (UniqueName: \"kubernetes.io/projected/1df1ead0-bf09-4f2f-af65-52c27e8e750e-kube-api-access-zvkjh\") pod \"barbican-db-create-j5bh8\" (UID: \"1df1ead0-bf09-4f2f-af65-52c27e8e750e\") " pod="openstack/barbican-db-create-j5bh8" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.782480 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86-operator-scripts\") pod \"cinder-3b55-account-create-update-q96ln\" (UID: \"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86\") " pod="openstack/cinder-3b55-account-create-update-q96ln" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.785924 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vl82\" (UniqueName: \"kubernetes.io/projected/cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86-kube-api-access-9vl82\") pod \"cinder-3b55-account-create-update-q96ln\" (UID: \"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86\") " pod="openstack/cinder-3b55-account-create-update-q96ln" Nov 27 11:25:43 crc 
kubenswrapper[4807]: I1127 11:25:43.785864 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j5bh8" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.785668 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86-operator-scripts\") pod \"cinder-3b55-account-create-update-q96ln\" (UID: \"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86\") " pod="openstack/cinder-3b55-account-create-update-q96ln" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.797717 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8abf-account-create-update-kvbpv" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.804392 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vl82\" (UniqueName: \"kubernetes.io/projected/cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86-kube-api-access-9vl82\") pod \"cinder-3b55-account-create-update-q96ln\" (UID: \"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86\") " pod="openstack/cinder-3b55-account-create-update-q96ln" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.804844 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fb85-account-create-update-fsqpx"] Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.806620 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fb85-account-create-update-fsqpx" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.808065 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.819263 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fchf8"] Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.823697 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fchf8" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.829349 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fb85-account-create-update-fsqpx"] Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.840208 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fchf8"] Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.856758 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.887768 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kvt5\" (UniqueName: \"kubernetes.io/projected/e6d2a7ab-579e-4690-8067-7ba5c08cf3c9-kube-api-access-8kvt5\") pod \"neutron-fb85-account-create-update-fsqpx\" (UID: \"e6d2a7ab-579e-4690-8067-7ba5c08cf3c9\") " pod="openstack/neutron-fb85-account-create-update-fsqpx" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.887890 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d2a7ab-579e-4690-8067-7ba5c08cf3c9-operator-scripts\") pod \"neutron-fb85-account-create-update-fsqpx\" (UID: \"e6d2a7ab-579e-4690-8067-7ba5c08cf3c9\") " pod="openstack/neutron-fb85-account-create-update-fsqpx" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.905090 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3b55-account-create-update-q96ln" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.909402 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.989562 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kvt5\" (UniqueName: \"kubernetes.io/projected/e6d2a7ab-579e-4690-8067-7ba5c08cf3c9-kube-api-access-8kvt5\") pod \"neutron-fb85-account-create-update-fsqpx\" (UID: \"e6d2a7ab-579e-4690-8067-7ba5c08cf3c9\") " pod="openstack/neutron-fb85-account-create-update-fsqpx" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.989633 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e6fdb88-9641-4e76-9418-7deae0ca555a-operator-scripts\") pod \"neutron-db-create-fchf8\" (UID: \"7e6fdb88-9641-4e76-9418-7deae0ca555a\") " pod="openstack/neutron-db-create-fchf8" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.989657 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lxvm\" (UniqueName: \"kubernetes.io/projected/7e6fdb88-9641-4e76-9418-7deae0ca555a-kube-api-access-9lxvm\") pod \"neutron-db-create-fchf8\" (UID: \"7e6fdb88-9641-4e76-9418-7deae0ca555a\") " pod="openstack/neutron-db-create-fchf8" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.989718 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d2a7ab-579e-4690-8067-7ba5c08cf3c9-operator-scripts\") pod \"neutron-fb85-account-create-update-fsqpx\" (UID: \"e6d2a7ab-579e-4690-8067-7ba5c08cf3c9\") " pod="openstack/neutron-fb85-account-create-update-fsqpx" Nov 27 11:25:43 crc kubenswrapper[4807]: I1127 11:25:43.990550 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d2a7ab-579e-4690-8067-7ba5c08cf3c9-operator-scripts\") pod \"neutron-fb85-account-create-update-fsqpx\" (UID: \"e6d2a7ab-579e-4690-8067-7ba5c08cf3c9\") " pod="openstack/neutron-fb85-account-create-update-fsqpx" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.005011 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kvt5\" (UniqueName: \"kubernetes.io/projected/e6d2a7ab-579e-4690-8067-7ba5c08cf3c9-kube-api-access-8kvt5\") pod \"neutron-fb85-account-create-update-fsqpx\" (UID: \"e6d2a7ab-579e-4690-8067-7ba5c08cf3c9\") " pod="openstack/neutron-fb85-account-create-update-fsqpx" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.090997 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e6fdb88-9641-4e76-9418-7deae0ca555a-operator-scripts\") pod \"neutron-db-create-fchf8\" (UID: \"7e6fdb88-9641-4e76-9418-7deae0ca555a\") " pod="openstack/neutron-db-create-fchf8" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.091051 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lxvm\" (UniqueName: \"kubernetes.io/projected/7e6fdb88-9641-4e76-9418-7deae0ca555a-kube-api-access-9lxvm\") pod \"neutron-db-create-fchf8\" (UID: \"7e6fdb88-9641-4e76-9418-7deae0ca555a\") " pod="openstack/neutron-db-create-fchf8" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.091747 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e6fdb88-9641-4e76-9418-7deae0ca555a-operator-scripts\") pod \"neutron-db-create-fchf8\" (UID: \"7e6fdb88-9641-4e76-9418-7deae0ca555a\") " pod="openstack/neutron-db-create-fchf8" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.092168 4807 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.093719 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.096278 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-hh9hx" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.096450 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.096600 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.096777 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.111635 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.113858 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lxvm\" (UniqueName: \"kubernetes.io/projected/7e6fdb88-9641-4e76-9418-7deae0ca555a-kube-api-access-9lxvm\") pod \"neutron-db-create-fchf8\" (UID: \"7e6fdb88-9641-4e76-9418-7deae0ca555a\") " pod="openstack/neutron-db-create-fchf8" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.164075 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fb85-account-create-update-fsqpx" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.168384 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fchf8" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.192582 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2270de6-a69c-44be-8fb7-98e10027cd34-config\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.192705 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f2270de6-a69c-44be-8fb7-98e10027cd34-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.192732 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2270de6-a69c-44be-8fb7-98e10027cd34-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.192749 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ff9b\" (UniqueName: \"kubernetes.io/projected/f2270de6-a69c-44be-8fb7-98e10027cd34-kube-api-access-9ff9b\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.192770 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2270de6-a69c-44be-8fb7-98e10027cd34-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 
11:25:44.192870 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2270de6-a69c-44be-8fb7-98e10027cd34-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.192913 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2270de6-a69c-44be-8fb7-98e10027cd34-scripts\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.294301 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f2270de6-a69c-44be-8fb7-98e10027cd34-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.294349 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2270de6-a69c-44be-8fb7-98e10027cd34-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.294367 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ff9b\" (UniqueName: \"kubernetes.io/projected/f2270de6-a69c-44be-8fb7-98e10027cd34-kube-api-access-9ff9b\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.294397 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f2270de6-a69c-44be-8fb7-98e10027cd34-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.294477 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2270de6-a69c-44be-8fb7-98e10027cd34-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.294517 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2270de6-a69c-44be-8fb7-98e10027cd34-scripts\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.294541 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2270de6-a69c-44be-8fb7-98e10027cd34-config\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.295830 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f2270de6-a69c-44be-8fb7-98e10027cd34-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.298696 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2270de6-a69c-44be-8fb7-98e10027cd34-scripts\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.298782 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2270de6-a69c-44be-8fb7-98e10027cd34-config\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.300850 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2270de6-a69c-44be-8fb7-98e10027cd34-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.301442 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2270de6-a69c-44be-8fb7-98e10027cd34-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.301498 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2270de6-a69c-44be-8fb7-98e10027cd34-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.311918 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ff9b\" (UniqueName: \"kubernetes.io/projected/f2270de6-a69c-44be-8fb7-98e10027cd34-kube-api-access-9ff9b\") pod \"ovn-northd-0\" (UID: \"f2270de6-a69c-44be-8fb7-98e10027cd34\") " pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.415216 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 27 11:25:44 crc kubenswrapper[4807]: I1127 11:25:44.666424 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:45 crc kubenswrapper[4807]: I1127 11:25:45.578838 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9gjp"] Nov 27 11:25:45 crc kubenswrapper[4807]: I1127 11:25:45.581431 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w9gjp" podUID="72480ac5-37a1-414b-b364-4290d9525ddb" containerName="registry-server" containerID="cri-o://1c89d26302abeea6f5c01f3f59b2b283336e6db0bea1e8fce958bc01cecd9025" gracePeriod=2 Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.016161 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-79e1-account-create-update-cjvzt" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.038840 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9fmgz" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.132834 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-256bb\" (UniqueName: \"kubernetes.io/projected/c0ca4c18-464b-4f06-8dee-dae2b044e9a5-kube-api-access-256bb\") pod \"c0ca4c18-464b-4f06-8dee-dae2b044e9a5\" (UID: \"c0ca4c18-464b-4f06-8dee-dae2b044e9a5\") " Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.132920 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv9gb\" (UniqueName: \"kubernetes.io/projected/d413ebfd-893d-437a-b034-72425fa40d8a-kube-api-access-qv9gb\") pod \"d413ebfd-893d-437a-b034-72425fa40d8a\" (UID: \"d413ebfd-893d-437a-b034-72425fa40d8a\") " Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.133015 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ca4c18-464b-4f06-8dee-dae2b044e9a5-operator-scripts\") pod \"c0ca4c18-464b-4f06-8dee-dae2b044e9a5\" (UID: \"c0ca4c18-464b-4f06-8dee-dae2b044e9a5\") " Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.133043 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d413ebfd-893d-437a-b034-72425fa40d8a-operator-scripts\") pod \"d413ebfd-893d-437a-b034-72425fa40d8a\" (UID: \"d413ebfd-893d-437a-b034-72425fa40d8a\") " Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.134367 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0ca4c18-464b-4f06-8dee-dae2b044e9a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c0ca4c18-464b-4f06-8dee-dae2b044e9a5" (UID: "c0ca4c18-464b-4f06-8dee-dae2b044e9a5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.134529 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d413ebfd-893d-437a-b034-72425fa40d8a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d413ebfd-893d-437a-b034-72425fa40d8a" (UID: "d413ebfd-893d-437a-b034-72425fa40d8a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.141471 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d413ebfd-893d-437a-b034-72425fa40d8a-kube-api-access-qv9gb" (OuterVolumeSpecName: "kube-api-access-qv9gb") pod "d413ebfd-893d-437a-b034-72425fa40d8a" (UID: "d413ebfd-893d-437a-b034-72425fa40d8a"). InnerVolumeSpecName "kube-api-access-qv9gb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.142842 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0ca4c18-464b-4f06-8dee-dae2b044e9a5-kube-api-access-256bb" (OuterVolumeSpecName: "kube-api-access-256bb") pod "c0ca4c18-464b-4f06-8dee-dae2b044e9a5" (UID: "c0ca4c18-464b-4f06-8dee-dae2b044e9a5"). InnerVolumeSpecName "kube-api-access-256bb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.240959 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c0ca4c18-464b-4f06-8dee-dae2b044e9a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.241227 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d413ebfd-893d-437a-b034-72425fa40d8a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.241238 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-256bb\" (UniqueName: \"kubernetes.io/projected/c0ca4c18-464b-4f06-8dee-dae2b044e9a5-kube-api-access-256bb\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.241259 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv9gb\" (UniqueName: \"kubernetes.io/projected/d413ebfd-893d-437a-b034-72425fa40d8a-kube-api-access-qv9gb\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.257963 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6hg29" event={"ID":"b83dff2f-801e-4a9b-9427-48e1f51bcc79","Type":"ContainerStarted","Data":"d9593d8653907de4a777ae01853d668479e418c78b904a234f96c6aa9fffe297"} Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.264906 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9fmgz" event={"ID":"d413ebfd-893d-437a-b034-72425fa40d8a","Type":"ContainerDied","Data":"e44a1bce2724b88bb906255ac3829f501015643c84058577682f0f919d4ca767"} Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.264947 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9fmgz" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.264950 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e44a1bce2724b88bb906255ac3829f501015643c84058577682f0f919d4ca767" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.276457 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-79e1-account-create-update-cjvzt" event={"ID":"c0ca4c18-464b-4f06-8dee-dae2b044e9a5","Type":"ContainerDied","Data":"67cbaafe199df20d5c0e55a99e436953b1ae2397c023a26efce1caee57809a13"} Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.276503 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67cbaafe199df20d5c0e55a99e436953b1ae2397c023a26efce1caee57809a13" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.276610 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-79e1-account-create-update-cjvzt" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.278819 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.288785 4807 generic.go:334] "Generic (PLEG): container finished" podID="72480ac5-37a1-414b-b364-4290d9525ddb" containerID="1c89d26302abeea6f5c01f3f59b2b283336e6db0bea1e8fce958bc01cecd9025" exitCode=0 Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.288833 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9gjp" event={"ID":"72480ac5-37a1-414b-b364-4290d9525ddb","Type":"ContainerDied","Data":"1c89d26302abeea6f5c01f3f59b2b283336e6db0bea1e8fce958bc01cecd9025"} Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.295970 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-c8cvq"] Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.296039 4807 scope.go:117] "RemoveContainer" containerID="1c89d26302abeea6f5c01f3f59b2b283336e6db0bea1e8fce958bc01cecd9025" Nov 27 11:25:46 crc kubenswrapper[4807]: E1127 11:25:46.296985 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72480ac5-37a1-414b-b364-4290d9525ddb" containerName="extract-utilities" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.297003 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="72480ac5-37a1-414b-b364-4290d9525ddb" containerName="extract-utilities" Nov 27 11:25:46 crc kubenswrapper[4807]: E1127 11:25:46.297029 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ca4c18-464b-4f06-8dee-dae2b044e9a5" containerName="mariadb-account-create-update" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.297036 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ca4c18-464b-4f06-8dee-dae2b044e9a5" containerName="mariadb-account-create-update" Nov 27 11:25:46 crc kubenswrapper[4807]: E1127 11:25:46.297056 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d413ebfd-893d-437a-b034-72425fa40d8a" 
containerName="mariadb-database-create" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.297063 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d413ebfd-893d-437a-b034-72425fa40d8a" containerName="mariadb-database-create" Nov 27 11:25:46 crc kubenswrapper[4807]: E1127 11:25:46.297090 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72480ac5-37a1-414b-b364-4290d9525ddb" containerName="extract-content" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.297096 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="72480ac5-37a1-414b-b364-4290d9525ddb" containerName="extract-content" Nov 27 11:25:46 crc kubenswrapper[4807]: E1127 11:25:46.297108 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72480ac5-37a1-414b-b364-4290d9525ddb" containerName="registry-server" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.297114 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="72480ac5-37a1-414b-b364-4290d9525ddb" containerName="registry-server" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.297403 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d413ebfd-893d-437a-b034-72425fa40d8a" containerName="mariadb-database-create" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.297425 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="72480ac5-37a1-414b-b364-4290d9525ddb" containerName="registry-server" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.297445 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0ca4c18-464b-4f06-8dee-dae2b044e9a5" containerName="mariadb-account-create-update" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.299594 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-c8cvq" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.313809 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8534-account-create-update-q2bbw"] Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.314955 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8534-account-create-update-q2bbw" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.320425 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8534-account-create-update-q2bbw"] Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.321741 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.336605 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-c8cvq"] Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.341496 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-6hg29" podStartSLOduration=1.3823641979999999 podStartE2EDuration="6.341474381s" podCreationTimestamp="2025-11-27 11:25:40 +0000 UTC" firstStartedPulling="2025-11-27 11:25:40.907853113 +0000 UTC m=+982.007351301" lastFinishedPulling="2025-11-27 11:25:45.866963286 +0000 UTC m=+986.966461484" observedRunningTime="2025-11-27 11:25:46.291593772 +0000 UTC m=+987.391091970" watchObservedRunningTime="2025-11-27 11:25:46.341474381 +0000 UTC m=+987.440972579" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.351298 4807 scope.go:117] "RemoveContainer" containerID="9e873195553453996426f105a171778d52c29988659cb9dc3526be8575f97ff7" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.389404 4807 scope.go:117] "RemoveContainer" containerID="8f5660994310a44a62c5df25530684460af218339f36a9e897959d3e495db9d9" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.449286 4807 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k6mz\" (UniqueName: \"kubernetes.io/projected/72480ac5-37a1-414b-b364-4290d9525ddb-kube-api-access-5k6mz\") pod \"72480ac5-37a1-414b-b364-4290d9525ddb\" (UID: \"72480ac5-37a1-414b-b364-4290d9525ddb\") " Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.449353 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72480ac5-37a1-414b-b364-4290d9525ddb-catalog-content\") pod \"72480ac5-37a1-414b-b364-4290d9525ddb\" (UID: \"72480ac5-37a1-414b-b364-4290d9525ddb\") " Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.449398 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72480ac5-37a1-414b-b364-4290d9525ddb-utilities\") pod \"72480ac5-37a1-414b-b364-4290d9525ddb\" (UID: \"72480ac5-37a1-414b-b364-4290d9525ddb\") " Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.449596 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k4wb\" (UniqueName: \"kubernetes.io/projected/b61fd3b7-c996-4009-8e5a-208bba68066f-kube-api-access-8k4wb\") pod \"keystone-db-create-c8cvq\" (UID: \"b61fd3b7-c996-4009-8e5a-208bba68066f\") " pod="openstack/keystone-db-create-c8cvq" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.449663 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00b9f61b-2731-4391-9be9-56437aa380e7-operator-scripts\") pod \"keystone-8534-account-create-update-q2bbw\" (UID: \"00b9f61b-2731-4391-9be9-56437aa380e7\") " pod="openstack/keystone-8534-account-create-update-q2bbw" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.449716 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61fd3b7-c996-4009-8e5a-208bba68066f-operator-scripts\") pod \"keystone-db-create-c8cvq\" (UID: \"b61fd3b7-c996-4009-8e5a-208bba68066f\") " pod="openstack/keystone-db-create-c8cvq" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.449734 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkp4g\" (UniqueName: \"kubernetes.io/projected/00b9f61b-2731-4391-9be9-56437aa380e7-kube-api-access-fkp4g\") pod \"keystone-8534-account-create-update-q2bbw\" (UID: \"00b9f61b-2731-4391-9be9-56437aa380e7\") " pod="openstack/keystone-8534-account-create-update-q2bbw" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.451579 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72480ac5-37a1-414b-b364-4290d9525ddb-utilities" (OuterVolumeSpecName: "utilities") pod "72480ac5-37a1-414b-b364-4290d9525ddb" (UID: "72480ac5-37a1-414b-b364-4290d9525ddb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.454177 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72480ac5-37a1-414b-b364-4290d9525ddb-kube-api-access-5k6mz" (OuterVolumeSpecName: "kube-api-access-5k6mz") pod "72480ac5-37a1-414b-b364-4290d9525ddb" (UID: "72480ac5-37a1-414b-b364-4290d9525ddb"). InnerVolumeSpecName "kube-api-access-5k6mz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.460806 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-j5bh8"] Nov 27 11:25:46 crc kubenswrapper[4807]: W1127 11:25:46.469473 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1df1ead0_bf09_4f2f_af65_52c27e8e750e.slice/crio-68cf779f6ddeca4bbfb5f89fecb5de802c45c0b8038787999c644ba6e6420cb3 WatchSource:0}: Error finding container 68cf779f6ddeca4bbfb5f89fecb5de802c45c0b8038787999c644ba6e6420cb3: Status 404 returned error can't find the container with id 68cf779f6ddeca4bbfb5f89fecb5de802c45c0b8038787999c644ba6e6420cb3 Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.477011 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fchf8"] Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.489752 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72480ac5-37a1-414b-b364-4290d9525ddb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72480ac5-37a1-414b-b364-4290d9525ddb" (UID: "72480ac5-37a1-414b-b364-4290d9525ddb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.552938 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k4wb\" (UniqueName: \"kubernetes.io/projected/b61fd3b7-c996-4009-8e5a-208bba68066f-kube-api-access-8k4wb\") pod \"keystone-db-create-c8cvq\" (UID: \"b61fd3b7-c996-4009-8e5a-208bba68066f\") " pod="openstack/keystone-db-create-c8cvq" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.553011 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00b9f61b-2731-4391-9be9-56437aa380e7-operator-scripts\") pod \"keystone-8534-account-create-update-q2bbw\" (UID: \"00b9f61b-2731-4391-9be9-56437aa380e7\") " pod="openstack/keystone-8534-account-create-update-q2bbw" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.553061 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61fd3b7-c996-4009-8e5a-208bba68066f-operator-scripts\") pod \"keystone-db-create-c8cvq\" (UID: \"b61fd3b7-c996-4009-8e5a-208bba68066f\") " pod="openstack/keystone-db-create-c8cvq" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.553082 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkp4g\" (UniqueName: \"kubernetes.io/projected/00b9f61b-2731-4391-9be9-56437aa380e7-kube-api-access-fkp4g\") pod \"keystone-8534-account-create-update-q2bbw\" (UID: \"00b9f61b-2731-4391-9be9-56437aa380e7\") " pod="openstack/keystone-8534-account-create-update-q2bbw" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.553152 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k6mz\" (UniqueName: \"kubernetes.io/projected/72480ac5-37a1-414b-b364-4290d9525ddb-kube-api-access-5k6mz\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:46 crc 
kubenswrapper[4807]: I1127 11:25:46.553163 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72480ac5-37a1-414b-b364-4290d9525ddb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.553179 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72480ac5-37a1-414b-b364-4290d9525ddb-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.556540 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00b9f61b-2731-4391-9be9-56437aa380e7-operator-scripts\") pod \"keystone-8534-account-create-update-q2bbw\" (UID: \"00b9f61b-2731-4391-9be9-56437aa380e7\") " pod="openstack/keystone-8534-account-create-update-q2bbw" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.557379 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61fd3b7-c996-4009-8e5a-208bba68066f-operator-scripts\") pod \"keystone-db-create-c8cvq\" (UID: \"b61fd3b7-c996-4009-8e5a-208bba68066f\") " pod="openstack/keystone-db-create-c8cvq" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.575672 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkp4g\" (UniqueName: \"kubernetes.io/projected/00b9f61b-2731-4391-9be9-56437aa380e7-kube-api-access-fkp4g\") pod \"keystone-8534-account-create-update-q2bbw\" (UID: \"00b9f61b-2731-4391-9be9-56437aa380e7\") " pod="openstack/keystone-8534-account-create-update-q2bbw" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.579852 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k4wb\" (UniqueName: \"kubernetes.io/projected/b61fd3b7-c996-4009-8e5a-208bba68066f-kube-api-access-8k4wb\") pod 
\"keystone-db-create-c8cvq\" (UID: \"b61fd3b7-c996-4009-8e5a-208bba68066f\") " pod="openstack/keystone-db-create-c8cvq" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.616861 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-6vqr9"] Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.622545 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6vqr9" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.623751 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-b10b-account-create-update-9ck9q"] Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.625202 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b10b-account-create-update-9ck9q" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.627409 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.631163 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6vqr9"] Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.647700 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-c8cvq" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.661701 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b10b-account-create-update-9ck9q"] Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.669081 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-8abf-account-create-update-kvbpv"] Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.670138 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8534-account-create-update-q2bbw" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.676296 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3b55-account-create-update-q96ln"] Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.716158 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.768127 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh8k4\" (UniqueName: \"kubernetes.io/projected/d67782cf-87a3-44fe-88c8-16d66f224029-kube-api-access-sh8k4\") pod \"placement-b10b-account-create-update-9ck9q\" (UID: \"d67782cf-87a3-44fe-88c8-16d66f224029\") " pod="openstack/placement-b10b-account-create-update-9ck9q" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.768185 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d67782cf-87a3-44fe-88c8-16d66f224029-operator-scripts\") pod \"placement-b10b-account-create-update-9ck9q\" (UID: \"d67782cf-87a3-44fe-88c8-16d66f224029\") " pod="openstack/placement-b10b-account-create-update-9ck9q" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.768239 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a128226-2a88-4b86-8592-52eb97146227-operator-scripts\") pod \"placement-db-create-6vqr9\" (UID: \"4a128226-2a88-4b86-8592-52eb97146227\") " pod="openstack/placement-db-create-6vqr9" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.768279 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prjxj\" (UniqueName: \"kubernetes.io/projected/4a128226-2a88-4b86-8592-52eb97146227-kube-api-access-prjxj\") pod 
\"placement-db-create-6vqr9\" (UID: \"4a128226-2a88-4b86-8592-52eb97146227\") " pod="openstack/placement-db-create-6vqr9" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.870589 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh8k4\" (UniqueName: \"kubernetes.io/projected/d67782cf-87a3-44fe-88c8-16d66f224029-kube-api-access-sh8k4\") pod \"placement-b10b-account-create-update-9ck9q\" (UID: \"d67782cf-87a3-44fe-88c8-16d66f224029\") " pod="openstack/placement-b10b-account-create-update-9ck9q" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.870876 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d67782cf-87a3-44fe-88c8-16d66f224029-operator-scripts\") pod \"placement-b10b-account-create-update-9ck9q\" (UID: \"d67782cf-87a3-44fe-88c8-16d66f224029\") " pod="openstack/placement-b10b-account-create-update-9ck9q" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.870932 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a128226-2a88-4b86-8592-52eb97146227-operator-scripts\") pod \"placement-db-create-6vqr9\" (UID: \"4a128226-2a88-4b86-8592-52eb97146227\") " pod="openstack/placement-db-create-6vqr9" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.870961 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prjxj\" (UniqueName: \"kubernetes.io/projected/4a128226-2a88-4b86-8592-52eb97146227-kube-api-access-prjxj\") pod \"placement-db-create-6vqr9\" (UID: \"4a128226-2a88-4b86-8592-52eb97146227\") " pod="openstack/placement-db-create-6vqr9" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.871992 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d67782cf-87a3-44fe-88c8-16d66f224029-operator-scripts\") pod \"placement-b10b-account-create-update-9ck9q\" (UID: \"d67782cf-87a3-44fe-88c8-16d66f224029\") " pod="openstack/placement-b10b-account-create-update-9ck9q" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.872014 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a128226-2a88-4b86-8592-52eb97146227-operator-scripts\") pod \"placement-db-create-6vqr9\" (UID: \"4a128226-2a88-4b86-8592-52eb97146227\") " pod="openstack/placement-db-create-6vqr9" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.887451 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prjxj\" (UniqueName: \"kubernetes.io/projected/4a128226-2a88-4b86-8592-52eb97146227-kube-api-access-prjxj\") pod \"placement-db-create-6vqr9\" (UID: \"4a128226-2a88-4b86-8592-52eb97146227\") " pod="openstack/placement-db-create-6vqr9" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.893137 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh8k4\" (UniqueName: \"kubernetes.io/projected/d67782cf-87a3-44fe-88c8-16d66f224029-kube-api-access-sh8k4\") pod \"placement-b10b-account-create-update-9ck9q\" (UID: \"d67782cf-87a3-44fe-88c8-16d66f224029\") " pod="openstack/placement-b10b-account-create-update-9ck9q" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.966875 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-b10b-account-create-update-9ck9q" Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.979533 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4qq8v"] Nov 27 11:25:46 crc kubenswrapper[4807]: I1127 11:25:46.995806 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fb85-account-create-update-fsqpx"] Nov 27 11:25:46 crc kubenswrapper[4807]: W1127 11:25:46.999096 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ab6613c_de00_44b8_9a65_cff272bb76e8.slice/crio-ba11fc6b054cb048b1ebd51318d53447e83dbf3fdb9fd69037be9ac9cf0cc2ef WatchSource:0}: Error finding container ba11fc6b054cb048b1ebd51318d53447e83dbf3fdb9fd69037be9ac9cf0cc2ef: Status 404 returned error can't find the container with id ba11fc6b054cb048b1ebd51318d53447e83dbf3fdb9fd69037be9ac9cf0cc2ef Nov 27 11:25:47 crc kubenswrapper[4807]: W1127 11:25:47.003848 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6d2a7ab_579e_4690_8067_7ba5c08cf3c9.slice/crio-5beb960856e2709a13b6e5c89f2841e7ff00dddc86d0e94b2774fc7896ca0b91 WatchSource:0}: Error finding container 5beb960856e2709a13b6e5c89f2841e7ff00dddc86d0e94b2774fc7896ca0b91: Status 404 returned error can't find the container with id 5beb960856e2709a13b6e5c89f2841e7ff00dddc86d0e94b2774fc7896ca0b91 Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.038560 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-6vqr9" Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.206650 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-c8cvq"] Nov 27 11:25:47 crc kubenswrapper[4807]: W1127 11:25:47.226483 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb61fd3b7_c996_4009_8e5a_208bba68066f.slice/crio-0e0186d7a9a435f344ec4a44c1fc648cf20bb97b843293d9c48b536b372d87a8 WatchSource:0}: Error finding container 0e0186d7a9a435f344ec4a44c1fc648cf20bb97b843293d9c48b536b372d87a8: Status 404 returned error can't find the container with id 0e0186d7a9a435f344ec4a44c1fc648cf20bb97b843293d9c48b536b372d87a8 Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.325450 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c8cvq" event={"ID":"b61fd3b7-c996-4009-8e5a-208bba68066f","Type":"ContainerStarted","Data":"0e0186d7a9a435f344ec4a44c1fc648cf20bb97b843293d9c48b536b372d87a8"} Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.329869 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fb85-account-create-update-fsqpx" event={"ID":"e6d2a7ab-579e-4690-8067-7ba5c08cf3c9","Type":"ContainerStarted","Data":"5beb960856e2709a13b6e5c89f2841e7ff00dddc86d0e94b2774fc7896ca0b91"} Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.335972 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f2270de6-a69c-44be-8fb7-98e10027cd34","Type":"ContainerStarted","Data":"467dc6dec2e31140b48316030e26cab1160fb9857ca381e403f1f2302f8f8597"} Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.341237 4807 generic.go:334] "Generic (PLEG): container finished" podID="1df1ead0-bf09-4f2f-af65-52c27e8e750e" containerID="c62dee7eb94cb18333febe98af1b8fa218551d8283dfd41d1d1dd1a2438e8c2b" exitCode=0 Nov 27 11:25:47 crc 
kubenswrapper[4807]: I1127 11:25:47.341435 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j5bh8" event={"ID":"1df1ead0-bf09-4f2f-af65-52c27e8e750e","Type":"ContainerDied","Data":"c62dee7eb94cb18333febe98af1b8fa218551d8283dfd41d1d1dd1a2438e8c2b"} Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.341465 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j5bh8" event={"ID":"1df1ead0-bf09-4f2f-af65-52c27e8e750e","Type":"ContainerStarted","Data":"68cf779f6ddeca4bbfb5f89fecb5de802c45c0b8038787999c644ba6e6420cb3"} Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.354600 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8534-account-create-update-q2bbw"] Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.365290 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-fb85-account-create-update-fsqpx" podStartSLOduration=4.365267829 podStartE2EDuration="4.365267829s" podCreationTimestamp="2025-11-27 11:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:25:47.346407061 +0000 UTC m=+988.445905269" watchObservedRunningTime="2025-11-27 11:25:47.365267829 +0000 UTC m=+988.464766027" Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.367567 4807 generic.go:334] "Generic (PLEG): container finished" podID="7e6fdb88-9641-4e76-9418-7deae0ca555a" containerID="e9e442e566d906d46e1651604508bb54f67f9ec205754af36d294435cd4d572c" exitCode=0 Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.367682 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fchf8" event={"ID":"7e6fdb88-9641-4e76-9418-7deae0ca555a","Type":"ContainerDied","Data":"e9e442e566d906d46e1651604508bb54f67f9ec205754af36d294435cd4d572c"} Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.367710 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fchf8" event={"ID":"7e6fdb88-9641-4e76-9418-7deae0ca555a","Type":"ContainerStarted","Data":"3a329cfe5dd3f4666f831d746951038e50afe58bdfeaa183826449f0fa9d07e4"} Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.373808 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8abf-account-create-update-kvbpv" event={"ID":"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7","Type":"ContainerStarted","Data":"8483b45075dea8b3fed4078688781cfe06ebecd7ed015982417699173049577b"} Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.373850 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8abf-account-create-update-kvbpv" event={"ID":"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7","Type":"ContainerStarted","Data":"7fad5799f0cceb18eb0c246325f2c21b384a4b78f0f8bf74805db2506e7ca1cf"} Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.389574 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9gjp" event={"ID":"72480ac5-37a1-414b-b364-4290d9525ddb","Type":"ContainerDied","Data":"3182558dd1f08f59e878f3f3358c90ac8eb03b9b08ccf0d45d593e641ef8fde9"} Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.389716 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9gjp" Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.392679 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3b55-account-create-update-q96ln" event={"ID":"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86","Type":"ContainerStarted","Data":"fcd19fd8d50d0f57e1288a3a5bd8cff57feabd718395e7c3f92c2267c55b836a"} Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.392718 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3b55-account-create-update-q96ln" event={"ID":"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86","Type":"ContainerStarted","Data":"64ae97d9101726e3c90b066ce2fefb11616f336af0b450065c3622f082b56e3b"} Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.399155 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4qq8v" event={"ID":"8ab6613c-de00-44b8-9a65-cff272bb76e8","Type":"ContainerStarted","Data":"ba11fc6b054cb048b1ebd51318d53447e83dbf3fdb9fd69037be9ac9cf0cc2ef"} Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.428503 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-8abf-account-create-update-kvbpv" podStartSLOduration=4.428480411 podStartE2EDuration="4.428480411s" podCreationTimestamp="2025-11-27 11:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:25:47.397378288 +0000 UTC m=+988.496876486" watchObservedRunningTime="2025-11-27 11:25:47.428480411 +0000 UTC m=+988.527978609" Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.455370 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-3b55-account-create-update-q96ln" podStartSLOduration=4.455352221 podStartE2EDuration="4.455352221s" podCreationTimestamp="2025-11-27 11:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:25:47.415526478 +0000 UTC m=+988.515024676" watchObservedRunningTime="2025-11-27 11:25:47.455352221 +0000 UTC m=+988.554850419" Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.472610 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9gjp"] Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.491781 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:47 crc kubenswrapper[4807]: E1127 11:25:47.491912 4807 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 27 11:25:47 crc kubenswrapper[4807]: E1127 11:25:47.491956 4807 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 27 11:25:47 crc kubenswrapper[4807]: E1127 11:25:47.492031 4807 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift podName:bc29fb6b-2886-4d51-8afd-be8fc1069ee4 nodeName:}" failed. No retries permitted until 2025-11-27 11:25:55.49200865 +0000 UTC m=+996.591506858 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift") pod "swift-storage-0" (UID: "bc29fb6b-2886-4d51-8afd-be8fc1069ee4") : configmap "swift-ring-files" not found Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.497693 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9gjp"] Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.505742 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-b10b-account-create-update-9ck9q"] Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.546017 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72480ac5-37a1-414b-b364-4290d9525ddb" path="/var/lib/kubelet/pods/72480ac5-37a1-414b-b364-4290d9525ddb/volumes" Nov 27 11:25:47 crc kubenswrapper[4807]: I1127 11:25:47.670154 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-6vqr9"] Nov 27 11:25:47 crc kubenswrapper[4807]: W1127 11:25:47.761735 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a128226_2a88_4b86_8592_52eb97146227.slice/crio-4646133b714be6cba396cbecf855874a272feaf1461d559fda58dab16a969e2f WatchSource:0}: Error finding container 4646133b714be6cba396cbecf855874a272feaf1461d559fda58dab16a969e2f: Status 404 returned error can't find the container with id 4646133b714be6cba396cbecf855874a272feaf1461d559fda58dab16a969e2f Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.313269 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.412846 4807 generic.go:334] "Generic (PLEG): container finished" podID="00b9f61b-2731-4391-9be9-56437aa380e7" containerID="bf9a3d8346831c0e092d078418588c9cb7623d129e78a924f34c5a64553c101a" exitCode=0 Nov 27 11:25:48 
crc kubenswrapper[4807]: I1127 11:25:48.413214 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8534-account-create-update-q2bbw" event={"ID":"00b9f61b-2731-4391-9be9-56437aa380e7","Type":"ContainerDied","Data":"bf9a3d8346831c0e092d078418588c9cb7623d129e78a924f34c5a64553c101a"} Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.413237 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8534-account-create-update-q2bbw" event={"ID":"00b9f61b-2731-4391-9be9-56437aa380e7","Type":"ContainerStarted","Data":"d5cfb83aab4c7af98501991fb9bf96df42058885710bd6e1c7e94a9c4257b08e"} Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.420642 4807 generic.go:334] "Generic (PLEG): container finished" podID="cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86" containerID="fcd19fd8d50d0f57e1288a3a5bd8cff57feabd718395e7c3f92c2267c55b836a" exitCode=0 Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.420747 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3b55-account-create-update-q96ln" event={"ID":"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86","Type":"ContainerDied","Data":"fcd19fd8d50d0f57e1288a3a5bd8cff57feabd718395e7c3f92c2267c55b836a"} Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.424898 4807 generic.go:334] "Generic (PLEG): container finished" podID="0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7" containerID="8483b45075dea8b3fed4078688781cfe06ebecd7ed015982417699173049577b" exitCode=0 Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.424954 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8abf-account-create-update-kvbpv" event={"ID":"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7","Type":"ContainerDied","Data":"8483b45075dea8b3fed4078688781cfe06ebecd7ed015982417699173049577b"} Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.426952 4807 generic.go:334] "Generic (PLEG): container finished" podID="e6d2a7ab-579e-4690-8067-7ba5c08cf3c9" 
containerID="a7027967779e3bb30645768cc67c2c1b522d99b74dad797fb4d4eca42d78889e" exitCode=0 Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.427037 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fb85-account-create-update-fsqpx" event={"ID":"e6d2a7ab-579e-4690-8067-7ba5c08cf3c9","Type":"ContainerDied","Data":"a7027967779e3bb30645768cc67c2c1b522d99b74dad797fb4d4eca42d78889e"} Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.430076 4807 generic.go:334] "Generic (PLEG): container finished" podID="b61fd3b7-c996-4009-8e5a-208bba68066f" containerID="0c33d586f8adc7e401025375dad3b043e4e8c7b55df0ceb3ec92cba470dba98f" exitCode=0 Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.430142 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c8cvq" event={"ID":"b61fd3b7-c996-4009-8e5a-208bba68066f","Type":"ContainerDied","Data":"0c33d586f8adc7e401025375dad3b043e4e8c7b55df0ceb3ec92cba470dba98f"} Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.440848 4807 generic.go:334] "Generic (PLEG): container finished" podID="8ab6613c-de00-44b8-9a65-cff272bb76e8" containerID="5e24e77cbbc21ed639c4dca77f9b0679b58e1f1a86059d1b2b33af534e10122e" exitCode=0 Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.440914 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4qq8v" event={"ID":"8ab6613c-de00-44b8-9a65-cff272bb76e8","Type":"ContainerDied","Data":"5e24e77cbbc21ed639c4dca77f9b0679b58e1f1a86059d1b2b33af534e10122e"} Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.442606 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6vqr9" event={"ID":"4a128226-2a88-4b86-8592-52eb97146227","Type":"ContainerStarted","Data":"f964dd0eb744c72975313463cc7ca3d009eed12263d794eeeeae83172627b01a"} Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.442643 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-create-6vqr9" event={"ID":"4a128226-2a88-4b86-8592-52eb97146227","Type":"ContainerStarted","Data":"4646133b714be6cba396cbecf855874a272feaf1461d559fda58dab16a969e2f"} Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.462655 4807 generic.go:334] "Generic (PLEG): container finished" podID="d67782cf-87a3-44fe-88c8-16d66f224029" containerID="ae61011a72922ed1bbce2fb5e97b93dfe335d75192538aecb2d89ddee6d09b84" exitCode=0 Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.462929 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b10b-account-create-update-9ck9q" event={"ID":"d67782cf-87a3-44fe-88c8-16d66f224029","Type":"ContainerDied","Data":"ae61011a72922ed1bbce2fb5e97b93dfe335d75192538aecb2d89ddee6d09b84"} Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.463018 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b10b-account-create-update-9ck9q" event={"ID":"d67782cf-87a3-44fe-88c8-16d66f224029","Type":"ContainerStarted","Data":"2877535684b5f2e71ce218847fdf9874a8fc7514f1a841332f31939214c40229"} Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.554623 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-6vqr9" podStartSLOduration=2.554603004 podStartE2EDuration="2.554603004s" podCreationTimestamp="2025-11-27 11:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:25:48.537316067 +0000 UTC m=+989.636814265" watchObservedRunningTime="2025-11-27 11:25:48.554603004 +0000 UTC m=+989.654101202" Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.754908 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.831774 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-86db49b7ff-zlflh"] Nov 27 11:25:48 crc kubenswrapper[4807]: I1127 11:25:48.832038 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" podUID="6ecc155f-d98b-4a43-bf52-c58e0bf367f2" containerName="dnsmasq-dns" containerID="cri-o://71baeb89310b77e48a443bf1334c6b0c7b0eb20fbce7702798ac00e81309927c" gracePeriod=10 Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.044833 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fchf8" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.053483 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j5bh8" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.224601 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvkjh\" (UniqueName: \"kubernetes.io/projected/1df1ead0-bf09-4f2f-af65-52c27e8e750e-kube-api-access-zvkjh\") pod \"1df1ead0-bf09-4f2f-af65-52c27e8e750e\" (UID: \"1df1ead0-bf09-4f2f-af65-52c27e8e750e\") " Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.224638 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df1ead0-bf09-4f2f-af65-52c27e8e750e-operator-scripts\") pod \"1df1ead0-bf09-4f2f-af65-52c27e8e750e\" (UID: \"1df1ead0-bf09-4f2f-af65-52c27e8e750e\") " Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.224809 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lxvm\" (UniqueName: \"kubernetes.io/projected/7e6fdb88-9641-4e76-9418-7deae0ca555a-kube-api-access-9lxvm\") pod \"7e6fdb88-9641-4e76-9418-7deae0ca555a\" (UID: \"7e6fdb88-9641-4e76-9418-7deae0ca555a\") " Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.224884 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e6fdb88-9641-4e76-9418-7deae0ca555a-operator-scripts\") pod \"7e6fdb88-9641-4e76-9418-7deae0ca555a\" (UID: \"7e6fdb88-9641-4e76-9418-7deae0ca555a\") " Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.225832 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1df1ead0-bf09-4f2f-af65-52c27e8e750e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1df1ead0-bf09-4f2f-af65-52c27e8e750e" (UID: "1df1ead0-bf09-4f2f-af65-52c27e8e750e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.227473 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e6fdb88-9641-4e76-9418-7deae0ca555a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e6fdb88-9641-4e76-9418-7deae0ca555a" (UID: "7e6fdb88-9641-4e76-9418-7deae0ca555a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.233124 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6fdb88-9641-4e76-9418-7deae0ca555a-kube-api-access-9lxvm" (OuterVolumeSpecName: "kube-api-access-9lxvm") pod "7e6fdb88-9641-4e76-9418-7deae0ca555a" (UID: "7e6fdb88-9641-4e76-9418-7deae0ca555a"). InnerVolumeSpecName "kube-api-access-9lxvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.233917 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df1ead0-bf09-4f2f-af65-52c27e8e750e-kube-api-access-zvkjh" (OuterVolumeSpecName: "kube-api-access-zvkjh") pod "1df1ead0-bf09-4f2f-af65-52c27e8e750e" (UID: "1df1ead0-bf09-4f2f-af65-52c27e8e750e"). InnerVolumeSpecName "kube-api-access-zvkjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.309531 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.328810 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lxvm\" (UniqueName: \"kubernetes.io/projected/7e6fdb88-9641-4e76-9418-7deae0ca555a-kube-api-access-9lxvm\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.328839 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e6fdb88-9641-4e76-9418-7deae0ca555a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.328849 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvkjh\" (UniqueName: \"kubernetes.io/projected/1df1ead0-bf09-4f2f-af65-52c27e8e750e-kube-api-access-zvkjh\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.328857 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df1ead0-bf09-4f2f-af65-52c27e8e750e-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.430636 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-dns-svc\") pod \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.430713 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-ovsdbserver-nb\") pod \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\" 
(UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.430752 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqkw9\" (UniqueName: \"kubernetes.io/projected/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-kube-api-access-tqkw9\") pod \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.430865 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-ovsdbserver-sb\") pod \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.430975 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-config\") pod \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\" (UID: \"6ecc155f-d98b-4a43-bf52-c58e0bf367f2\") " Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.434863 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-kube-api-access-tqkw9" (OuterVolumeSpecName: "kube-api-access-tqkw9") pod "6ecc155f-d98b-4a43-bf52-c58e0bf367f2" (UID: "6ecc155f-d98b-4a43-bf52-c58e0bf367f2"). InnerVolumeSpecName "kube-api-access-tqkw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.471281 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ecc155f-d98b-4a43-bf52-c58e0bf367f2" (UID: "6ecc155f-d98b-4a43-bf52-c58e0bf367f2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.471680 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-config" (OuterVolumeSpecName: "config") pod "6ecc155f-d98b-4a43-bf52-c58e0bf367f2" (UID: "6ecc155f-d98b-4a43-bf52-c58e0bf367f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.477359 4807 generic.go:334] "Generic (PLEG): container finished" podID="6ecc155f-d98b-4a43-bf52-c58e0bf367f2" containerID="71baeb89310b77e48a443bf1334c6b0c7b0eb20fbce7702798ac00e81309927c" exitCode=0 Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.477412 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ecc155f-d98b-4a43-bf52-c58e0bf367f2" (UID: "6ecc155f-d98b-4a43-bf52-c58e0bf367f2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.477449 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" event={"ID":"6ecc155f-d98b-4a43-bf52-c58e0bf367f2","Type":"ContainerDied","Data":"71baeb89310b77e48a443bf1334c6b0c7b0eb20fbce7702798ac00e81309927c"} Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.477483 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" event={"ID":"6ecc155f-d98b-4a43-bf52-c58e0bf367f2","Type":"ContainerDied","Data":"8519c067211758c60f53649f5111254dfe1aa6d6f35df5b68c807aed9b3a9932"} Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.477499 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zlflh" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.477506 4807 scope.go:117] "RemoveContainer" containerID="71baeb89310b77e48a443bf1334c6b0c7b0eb20fbce7702798ac00e81309927c" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.480646 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f2270de6-a69c-44be-8fb7-98e10027cd34","Type":"ContainerStarted","Data":"3fe00eb08ff9e8bb988d5bd71312f1a3e5f58978bea2eb085e9c5faecf2aec3e"} Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.480794 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f2270de6-a69c-44be-8fb7-98e10027cd34","Type":"ContainerStarted","Data":"cee67ea112f4ffd4c31c2e372b9b338363d1c6622377dde1e91eba970101c61d"} Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.481029 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.482384 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-j5bh8" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.482571 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-j5bh8" event={"ID":"1df1ead0-bf09-4f2f-af65-52c27e8e750e","Type":"ContainerDied","Data":"68cf779f6ddeca4bbfb5f89fecb5de802c45c0b8038787999c644ba6e6420cb3"} Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.482621 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68cf779f6ddeca4bbfb5f89fecb5de802c45c0b8038787999c644ba6e6420cb3" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.491118 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-fchf8" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.491129 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fchf8" event={"ID":"7e6fdb88-9641-4e76-9418-7deae0ca555a","Type":"ContainerDied","Data":"3a329cfe5dd3f4666f831d746951038e50afe58bdfeaa183826449f0fa9d07e4"} Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.491371 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a329cfe5dd3f4666f831d746951038e50afe58bdfeaa183826449f0fa9d07e4" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.498961 4807 scope.go:117] "RemoveContainer" containerID="923e5ab7be62b77e9612b6d6d9dbf707d54316654e93bbda8891b645f29c32b8" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.499518 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ecc155f-d98b-4a43-bf52-c58e0bf367f2" (UID: "6ecc155f-d98b-4a43-bf52-c58e0bf367f2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.508392 4807 generic.go:334] "Generic (PLEG): container finished" podID="4a128226-2a88-4b86-8592-52eb97146227" containerID="f964dd0eb744c72975313463cc7ca3d009eed12263d794eeeeae83172627b01a" exitCode=0 Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.508555 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6vqr9" event={"ID":"4a128226-2a88-4b86-8592-52eb97146227","Type":"ContainerDied","Data":"f964dd0eb744c72975313463cc7ca3d009eed12263d794eeeeae83172627b01a"} Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.512368 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=4.034969686 podStartE2EDuration="5.512349016s" podCreationTimestamp="2025-11-27 11:25:44 +0000 UTC" firstStartedPulling="2025-11-27 11:25:46.719882476 +0000 UTC m=+987.819380674" lastFinishedPulling="2025-11-27 11:25:48.197261806 +0000 UTC m=+989.296760004" observedRunningTime="2025-11-27 11:25:49.50797019 +0000 UTC m=+990.607468398" watchObservedRunningTime="2025-11-27 11:25:49.512349016 +0000 UTC m=+990.611847214" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.535190 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.535223 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.535236 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" 
Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.535268 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqkw9\" (UniqueName: \"kubernetes.io/projected/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-kube-api-access-tqkw9\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.535281 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ecc155f-d98b-4a43-bf52-c58e0bf367f2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.540805 4807 scope.go:117] "RemoveContainer" containerID="71baeb89310b77e48a443bf1334c6b0c7b0eb20fbce7702798ac00e81309927c" Nov 27 11:25:49 crc kubenswrapper[4807]: E1127 11:25:49.551720 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71baeb89310b77e48a443bf1334c6b0c7b0eb20fbce7702798ac00e81309927c\": container with ID starting with 71baeb89310b77e48a443bf1334c6b0c7b0eb20fbce7702798ac00e81309927c not found: ID does not exist" containerID="71baeb89310b77e48a443bf1334c6b0c7b0eb20fbce7702798ac00e81309927c" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.551785 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71baeb89310b77e48a443bf1334c6b0c7b0eb20fbce7702798ac00e81309927c"} err="failed to get container status \"71baeb89310b77e48a443bf1334c6b0c7b0eb20fbce7702798ac00e81309927c\": rpc error: code = NotFound desc = could not find container \"71baeb89310b77e48a443bf1334c6b0c7b0eb20fbce7702798ac00e81309927c\": container with ID starting with 71baeb89310b77e48a443bf1334c6b0c7b0eb20fbce7702798ac00e81309927c not found: ID does not exist" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.551820 4807 scope.go:117] "RemoveContainer" containerID="923e5ab7be62b77e9612b6d6d9dbf707d54316654e93bbda8891b645f29c32b8" Nov 27 11:25:49 crc 
kubenswrapper[4807]: E1127 11:25:49.568741 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"923e5ab7be62b77e9612b6d6d9dbf707d54316654e93bbda8891b645f29c32b8\": container with ID starting with 923e5ab7be62b77e9612b6d6d9dbf707d54316654e93bbda8891b645f29c32b8 not found: ID does not exist" containerID="923e5ab7be62b77e9612b6d6d9dbf707d54316654e93bbda8891b645f29c32b8" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.568791 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923e5ab7be62b77e9612b6d6d9dbf707d54316654e93bbda8891b645f29c32b8"} err="failed to get container status \"923e5ab7be62b77e9612b6d6d9dbf707d54316654e93bbda8891b645f29c32b8\": rpc error: code = NotFound desc = could not find container \"923e5ab7be62b77e9612b6d6d9dbf707d54316654e93bbda8891b645f29c32b8\": container with ID starting with 923e5ab7be62b77e9612b6d6d9dbf707d54316654e93bbda8891b645f29c32b8 not found: ID does not exist" Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.804319 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zlflh"] Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.813877 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zlflh"] Nov 27 11:25:49 crc kubenswrapper[4807]: I1127 11:25:49.949960 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-8abf-account-create-update-kvbpv" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.150117 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7-operator-scripts\") pod \"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7\" (UID: \"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.150219 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scfw2\" (UniqueName: \"kubernetes.io/projected/0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7-kube-api-access-scfw2\") pod \"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7\" (UID: \"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.153112 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7" (UID: "0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.158621 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7-kube-api-access-scfw2" (OuterVolumeSpecName: "kube-api-access-scfw2") pod "0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7" (UID: "0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7"). InnerVolumeSpecName "kube-api-access-scfw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.231176 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-c8cvq" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.241227 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fb85-account-create-update-fsqpx" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.252476 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.252501 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scfw2\" (UniqueName: \"kubernetes.io/projected/0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7-kube-api-access-scfw2\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.254800 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4qq8v" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.264403 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b10b-account-create-update-9ck9q" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.279132 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8534-account-create-update-q2bbw" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.289763 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3b55-account-create-update-q96ln" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.353707 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d2a7ab-579e-4690-8067-7ba5c08cf3c9-operator-scripts\") pod \"e6d2a7ab-579e-4690-8067-7ba5c08cf3c9\" (UID: \"e6d2a7ab-579e-4690-8067-7ba5c08cf3c9\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.353781 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k4wb\" (UniqueName: \"kubernetes.io/projected/b61fd3b7-c996-4009-8e5a-208bba68066f-kube-api-access-8k4wb\") pod \"b61fd3b7-c996-4009-8e5a-208bba68066f\" (UID: \"b61fd3b7-c996-4009-8e5a-208bba68066f\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.353834 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61fd3b7-c996-4009-8e5a-208bba68066f-operator-scripts\") pod \"b61fd3b7-c996-4009-8e5a-208bba68066f\" (UID: \"b61fd3b7-c996-4009-8e5a-208bba68066f\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.353960 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kvt5\" (UniqueName: \"kubernetes.io/projected/e6d2a7ab-579e-4690-8067-7ba5c08cf3c9-kube-api-access-8kvt5\") pod \"e6d2a7ab-579e-4690-8067-7ba5c08cf3c9\" (UID: \"e6d2a7ab-579e-4690-8067-7ba5c08cf3c9\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.354633 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b61fd3b7-c996-4009-8e5a-208bba68066f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b61fd3b7-c996-4009-8e5a-208bba68066f" (UID: "b61fd3b7-c996-4009-8e5a-208bba68066f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.354724 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6d2a7ab-579e-4690-8067-7ba5c08cf3c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6d2a7ab-579e-4690-8067-7ba5c08cf3c9" (UID: "e6d2a7ab-579e-4690-8067-7ba5c08cf3c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.356979 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61fd3b7-c996-4009-8e5a-208bba68066f-kube-api-access-8k4wb" (OuterVolumeSpecName: "kube-api-access-8k4wb") pod "b61fd3b7-c996-4009-8e5a-208bba68066f" (UID: "b61fd3b7-c996-4009-8e5a-208bba68066f"). InnerVolumeSpecName "kube-api-access-8k4wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.357303 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d2a7ab-579e-4690-8067-7ba5c08cf3c9-kube-api-access-8kvt5" (OuterVolumeSpecName: "kube-api-access-8kvt5") pod "e6d2a7ab-579e-4690-8067-7ba5c08cf3c9" (UID: "e6d2a7ab-579e-4690-8067-7ba5c08cf3c9"). InnerVolumeSpecName "kube-api-access-8kvt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.454772 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q25jq\" (UniqueName: \"kubernetes.io/projected/8ab6613c-de00-44b8-9a65-cff272bb76e8-kube-api-access-q25jq\") pod \"8ab6613c-de00-44b8-9a65-cff272bb76e8\" (UID: \"8ab6613c-de00-44b8-9a65-cff272bb76e8\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.454813 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ab6613c-de00-44b8-9a65-cff272bb76e8-operator-scripts\") pod \"8ab6613c-de00-44b8-9a65-cff272bb76e8\" (UID: \"8ab6613c-de00-44b8-9a65-cff272bb76e8\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.454859 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh8k4\" (UniqueName: \"kubernetes.io/projected/d67782cf-87a3-44fe-88c8-16d66f224029-kube-api-access-sh8k4\") pod \"d67782cf-87a3-44fe-88c8-16d66f224029\" (UID: \"d67782cf-87a3-44fe-88c8-16d66f224029\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.454918 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vl82\" (UniqueName: \"kubernetes.io/projected/cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86-kube-api-access-9vl82\") pod \"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86\" (UID: \"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.454960 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d67782cf-87a3-44fe-88c8-16d66f224029-operator-scripts\") pod \"d67782cf-87a3-44fe-88c8-16d66f224029\" (UID: \"d67782cf-87a3-44fe-88c8-16d66f224029\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.454981 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00b9f61b-2731-4391-9be9-56437aa380e7-operator-scripts\") pod \"00b9f61b-2731-4391-9be9-56437aa380e7\" (UID: \"00b9f61b-2731-4391-9be9-56437aa380e7\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.455078 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkp4g\" (UniqueName: \"kubernetes.io/projected/00b9f61b-2731-4391-9be9-56437aa380e7-kube-api-access-fkp4g\") pod \"00b9f61b-2731-4391-9be9-56437aa380e7\" (UID: \"00b9f61b-2731-4391-9be9-56437aa380e7\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.455148 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86-operator-scripts\") pod \"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86\" (UID: \"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86\") " Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.455854 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d67782cf-87a3-44fe-88c8-16d66f224029-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d67782cf-87a3-44fe-88c8-16d66f224029" (UID: "d67782cf-87a3-44fe-88c8-16d66f224029"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.455855 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00b9f61b-2731-4391-9be9-56437aa380e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "00b9f61b-2731-4391-9be9-56437aa380e7" (UID: "00b9f61b-2731-4391-9be9-56437aa380e7"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.455956 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86" (UID: "cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.456177 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b61fd3b7-c996-4009-8e5a-208bba68066f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.456196 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.456208 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kvt5\" (UniqueName: \"kubernetes.io/projected/e6d2a7ab-579e-4690-8067-7ba5c08cf3c9-kube-api-access-8kvt5\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.456224 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d67782cf-87a3-44fe-88c8-16d66f224029-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.456235 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00b9f61b-2731-4391-9be9-56437aa380e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.456268 4807 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6d2a7ab-579e-4690-8067-7ba5c08cf3c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.456281 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k4wb\" (UniqueName: \"kubernetes.io/projected/b61fd3b7-c996-4009-8e5a-208bba68066f-kube-api-access-8k4wb\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.456781 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ab6613c-de00-44b8-9a65-cff272bb76e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ab6613c-de00-44b8-9a65-cff272bb76e8" (UID: "8ab6613c-de00-44b8-9a65-cff272bb76e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.460101 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d67782cf-87a3-44fe-88c8-16d66f224029-kube-api-access-sh8k4" (OuterVolumeSpecName: "kube-api-access-sh8k4") pod "d67782cf-87a3-44fe-88c8-16d66f224029" (UID: "d67782cf-87a3-44fe-88c8-16d66f224029"). InnerVolumeSpecName "kube-api-access-sh8k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.460163 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b9f61b-2731-4391-9be9-56437aa380e7-kube-api-access-fkp4g" (OuterVolumeSpecName: "kube-api-access-fkp4g") pod "00b9f61b-2731-4391-9be9-56437aa380e7" (UID: "00b9f61b-2731-4391-9be9-56437aa380e7"). InnerVolumeSpecName "kube-api-access-fkp4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.460196 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ab6613c-de00-44b8-9a65-cff272bb76e8-kube-api-access-q25jq" (OuterVolumeSpecName: "kube-api-access-q25jq") pod "8ab6613c-de00-44b8-9a65-cff272bb76e8" (UID: "8ab6613c-de00-44b8-9a65-cff272bb76e8"). InnerVolumeSpecName "kube-api-access-q25jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.460212 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86-kube-api-access-9vl82" (OuterVolumeSpecName: "kube-api-access-9vl82") pod "cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86" (UID: "cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86"). InnerVolumeSpecName "kube-api-access-9vl82". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.516832 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3b55-account-create-update-q96ln" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.520306 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3b55-account-create-update-q96ln" event={"ID":"cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86","Type":"ContainerDied","Data":"64ae97d9101726e3c90b066ce2fefb11616f336af0b450065c3622f082b56e3b"} Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.520334 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64ae97d9101726e3c90b066ce2fefb11616f336af0b450065c3622f082b56e3b" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.538183 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4qq8v" event={"ID":"8ab6613c-de00-44b8-9a65-cff272bb76e8","Type":"ContainerDied","Data":"ba11fc6b054cb048b1ebd51318d53447e83dbf3fdb9fd69037be9ac9cf0cc2ef"} Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.538216 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba11fc6b054cb048b1ebd51318d53447e83dbf3fdb9fd69037be9ac9cf0cc2ef" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.538262 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4qq8v" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.541676 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c8cvq" event={"ID":"b61fd3b7-c996-4009-8e5a-208bba68066f","Type":"ContainerDied","Data":"0e0186d7a9a435f344ec4a44c1fc648cf20bb97b843293d9c48b536b372d87a8"} Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.541721 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e0186d7a9a435f344ec4a44c1fc648cf20bb97b843293d9c48b536b372d87a8" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.541685 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-c8cvq" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.544328 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-8abf-account-create-update-kvbpv" event={"ID":"0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7","Type":"ContainerDied","Data":"7fad5799f0cceb18eb0c246325f2c21b384a4b78f0f8bf74805db2506e7ca1cf"} Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.544356 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fad5799f0cceb18eb0c246325f2c21b384a4b78f0f8bf74805db2506e7ca1cf" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.544455 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-8abf-account-create-update-kvbpv" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.546231 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fb85-account-create-update-fsqpx" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.546237 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fb85-account-create-update-fsqpx" event={"ID":"e6d2a7ab-579e-4690-8067-7ba5c08cf3c9","Type":"ContainerDied","Data":"5beb960856e2709a13b6e5c89f2841e7ff00dddc86d0e94b2774fc7896ca0b91"} Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.546315 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5beb960856e2709a13b6e5c89f2841e7ff00dddc86d0e94b2774fc7896ca0b91" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.548057 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8534-account-create-update-q2bbw" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.548113 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8534-account-create-update-q2bbw" event={"ID":"00b9f61b-2731-4391-9be9-56437aa380e7","Type":"ContainerDied","Data":"d5cfb83aab4c7af98501991fb9bf96df42058885710bd6e1c7e94a9c4257b08e"} Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.548167 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5cfb83aab4c7af98501991fb9bf96df42058885710bd6e1c7e94a9c4257b08e" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.550372 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-b10b-account-create-update-9ck9q" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.556999 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-b10b-account-create-update-9ck9q" event={"ID":"d67782cf-87a3-44fe-88c8-16d66f224029","Type":"ContainerDied","Data":"2877535684b5f2e71ce218847fdf9874a8fc7514f1a841332f31939214c40229"} Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.557038 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2877535684b5f2e71ce218847fdf9874a8fc7514f1a841332f31939214c40229" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.558104 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vl82\" (UniqueName: \"kubernetes.io/projected/cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86-kube-api-access-9vl82\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.558116 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkp4g\" (UniqueName: \"kubernetes.io/projected/00b9f61b-2731-4391-9be9-56437aa380e7-kube-api-access-fkp4g\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 
11:25:50.558127 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q25jq\" (UniqueName: \"kubernetes.io/projected/8ab6613c-de00-44b8-9a65-cff272bb76e8-kube-api-access-q25jq\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.558135 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ab6613c-de00-44b8-9a65-cff272bb76e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.558144 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh8k4\" (UniqueName: \"kubernetes.io/projected/d67782cf-87a3-44fe-88c8-16d66f224029-kube-api-access-sh8k4\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.886768 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6vqr9" Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.921435 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:25:50 crc kubenswrapper[4807]: I1127 11:25:50.921513 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.063981 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a128226-2a88-4b86-8592-52eb97146227-operator-scripts\") pod 
\"4a128226-2a88-4b86-8592-52eb97146227\" (UID: \"4a128226-2a88-4b86-8592-52eb97146227\") " Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.064043 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prjxj\" (UniqueName: \"kubernetes.io/projected/4a128226-2a88-4b86-8592-52eb97146227-kube-api-access-prjxj\") pod \"4a128226-2a88-4b86-8592-52eb97146227\" (UID: \"4a128226-2a88-4b86-8592-52eb97146227\") " Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.065374 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a128226-2a88-4b86-8592-52eb97146227-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a128226-2a88-4b86-8592-52eb97146227" (UID: "4a128226-2a88-4b86-8592-52eb97146227"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.075887 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a128226-2a88-4b86-8592-52eb97146227-kube-api-access-prjxj" (OuterVolumeSpecName: "kube-api-access-prjxj") pod "4a128226-2a88-4b86-8592-52eb97146227" (UID: "4a128226-2a88-4b86-8592-52eb97146227"). InnerVolumeSpecName "kube-api-access-prjxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.170268 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a128226-2a88-4b86-8592-52eb97146227-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.170308 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prjxj\" (UniqueName: \"kubernetes.io/projected/4a128226-2a88-4b86-8592-52eb97146227-kube-api-access-prjxj\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.566728 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ecc155f-d98b-4a43-bf52-c58e0bf367f2" path="/var/lib/kubelet/pods/6ecc155f-d98b-4a43-bf52-c58e0bf367f2/volumes" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.570096 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-6vqr9" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.570592 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-6vqr9" event={"ID":"4a128226-2a88-4b86-8592-52eb97146227","Type":"ContainerDied","Data":"4646133b714be6cba396cbecf855874a272feaf1461d559fda58dab16a969e2f"} Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.570628 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4646133b714be6cba396cbecf855874a272feaf1461d559fda58dab16a969e2f" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955035 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-xnw86"] Nov 27 11:25:51 crc kubenswrapper[4807]: E1127 11:25:51.955450 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 
11:25:51.955468 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: E1127 11:25:51.955484 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00b9f61b-2731-4391-9be9-56437aa380e7" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955493 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b9f61b-2731-4391-9be9-56437aa380e7" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: E1127 11:25:51.955503 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ecc155f-d98b-4a43-bf52-c58e0bf367f2" containerName="dnsmasq-dns" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955510 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ecc155f-d98b-4a43-bf52-c58e0bf367f2" containerName="dnsmasq-dns" Nov 27 11:25:51 crc kubenswrapper[4807]: E1127 11:25:51.955520 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d2a7ab-579e-4690-8067-7ba5c08cf3c9" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955527 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d2a7ab-579e-4690-8067-7ba5c08cf3c9" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: E1127 11:25:51.955539 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ecc155f-d98b-4a43-bf52-c58e0bf367f2" containerName="init" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955545 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ecc155f-d98b-4a43-bf52-c58e0bf367f2" containerName="init" Nov 27 11:25:51 crc kubenswrapper[4807]: E1127 11:25:51.955568 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d67782cf-87a3-44fe-88c8-16d66f224029" containerName="mariadb-account-create-update" Nov 27 
11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955576 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d67782cf-87a3-44fe-88c8-16d66f224029" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: E1127 11:25:51.955592 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ab6613c-de00-44b8-9a65-cff272bb76e8" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955599 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ab6613c-de00-44b8-9a65-cff272bb76e8" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: E1127 11:25:51.955607 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b61fd3b7-c996-4009-8e5a-208bba68066f" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955614 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61fd3b7-c996-4009-8e5a-208bba68066f" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: E1127 11:25:51.955626 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6fdb88-9641-4e76-9418-7deae0ca555a" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955633 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6fdb88-9641-4e76-9418-7deae0ca555a" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: E1127 11:25:51.955647 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a128226-2a88-4b86-8592-52eb97146227" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955655 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a128226-2a88-4b86-8592-52eb97146227" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: E1127 11:25:51.955670 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955677 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: E1127 11:25:51.955684 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df1ead0-bf09-4f2f-af65-52c27e8e750e" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955692 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df1ead0-bf09-4f2f-af65-52c27e8e750e" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955938 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6fdb88-9641-4e76-9418-7deae0ca555a" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955960 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.955991 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ecc155f-d98b-4a43-bf52-c58e0bf367f2" containerName="dnsmasq-dns" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.956003 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d67782cf-87a3-44fe-88c8-16d66f224029" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.956020 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ab6613c-de00-44b8-9a65-cff272bb76e8" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.956034 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df1ead0-bf09-4f2f-af65-52c27e8e750e" containerName="mariadb-database-create" Nov 27 11:25:51 crc 
kubenswrapper[4807]: I1127 11:25:51.956052 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="00b9f61b-2731-4391-9be9-56437aa380e7" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.956064 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.956084 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d2a7ab-579e-4690-8067-7ba5c08cf3c9" containerName="mariadb-account-create-update" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.956094 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a128226-2a88-4b86-8592-52eb97146227" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.956111 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b61fd3b7-c996-4009-8e5a-208bba68066f" containerName="mariadb-database-create" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.956763 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.961023 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.961305 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b87v8" Nov 27 11:25:51 crc kubenswrapper[4807]: I1127 11:25:51.970615 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xnw86"] Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.113990 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-combined-ca-bundle\") pod \"glance-db-sync-xnw86\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.114312 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-db-sync-config-data\") pod \"glance-db-sync-xnw86\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.114366 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9znw8\" (UniqueName: \"kubernetes.io/projected/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-kube-api-access-9znw8\") pod \"glance-db-sync-xnw86\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.114398 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-config-data\") pod \"glance-db-sync-xnw86\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.215854 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-db-sync-config-data\") pod \"glance-db-sync-xnw86\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.215895 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9znw8\" (UniqueName: \"kubernetes.io/projected/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-kube-api-access-9znw8\") pod \"glance-db-sync-xnw86\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.215920 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-config-data\") pod \"glance-db-sync-xnw86\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.216005 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-combined-ca-bundle\") pod \"glance-db-sync-xnw86\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.219078 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-db-sync-config-data\") pod \"glance-db-sync-xnw86\" (UID: 
\"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.219528 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-config-data\") pod \"glance-db-sync-xnw86\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.219722 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-combined-ca-bundle\") pod \"glance-db-sync-xnw86\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.237059 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9znw8\" (UniqueName: \"kubernetes.io/projected/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-kube-api-access-9znw8\") pod \"glance-db-sync-xnw86\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.275487 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xnw86" Nov 27 11:25:52 crc kubenswrapper[4807]: I1127 11:25:52.756864 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xnw86"] Nov 27 11:25:52 crc kubenswrapper[4807]: W1127 11:25:52.761920 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod123dc36c_92d1_4c98_9c02_ed2c4fbbff27.slice/crio-cd451455087df42ae52978fea10347b613b445747dc6b32c9be8076dcd081932 WatchSource:0}: Error finding container cd451455087df42ae52978fea10347b613b445747dc6b32c9be8076dcd081932: Status 404 returned error can't find the container with id cd451455087df42ae52978fea10347b613b445747dc6b32c9be8076dcd081932 Nov 27 11:25:53 crc kubenswrapper[4807]: I1127 11:25:53.584767 4807 generic.go:334] "Generic (PLEG): container finished" podID="b83dff2f-801e-4a9b-9427-48e1f51bcc79" containerID="d9593d8653907de4a777ae01853d668479e418c78b904a234f96c6aa9fffe297" exitCode=0 Nov 27 11:25:53 crc kubenswrapper[4807]: I1127 11:25:53.584856 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6hg29" event={"ID":"b83dff2f-801e-4a9b-9427-48e1f51bcc79","Type":"ContainerDied","Data":"d9593d8653907de4a777ae01853d668479e418c78b904a234f96c6aa9fffe297"} Nov 27 11:25:53 crc kubenswrapper[4807]: I1127 11:25:53.586295 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnw86" event={"ID":"123dc36c-92d1-4c98-9c02-ed2c4fbbff27","Type":"ContainerStarted","Data":"cd451455087df42ae52978fea10347b613b445747dc6b32c9be8076dcd081932"} Nov 27 11:25:54 crc kubenswrapper[4807]: I1127 11:25:54.951603 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.068907 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-dispersionconf\") pod \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.068989 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-swiftconf\") pod \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.069015 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b83dff2f-801e-4a9b-9427-48e1f51bcc79-scripts\") pod \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.069047 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b83dff2f-801e-4a9b-9427-48e1f51bcc79-ring-data-devices\") pod \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.069127 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-448ng\" (UniqueName: \"kubernetes.io/projected/b83dff2f-801e-4a9b-9427-48e1f51bcc79-kube-api-access-448ng\") pod \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.069183 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/b83dff2f-801e-4a9b-9427-48e1f51bcc79-etc-swift\") pod \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.069215 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-combined-ca-bundle\") pod \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\" (UID: \"b83dff2f-801e-4a9b-9427-48e1f51bcc79\") " Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.072117 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b83dff2f-801e-4a9b-9427-48e1f51bcc79-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b83dff2f-801e-4a9b-9427-48e1f51bcc79" (UID: "b83dff2f-801e-4a9b-9427-48e1f51bcc79"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.072434 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b83dff2f-801e-4a9b-9427-48e1f51bcc79-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b83dff2f-801e-4a9b-9427-48e1f51bcc79" (UID: "b83dff2f-801e-4a9b-9427-48e1f51bcc79"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.074995 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b83dff2f-801e-4a9b-9427-48e1f51bcc79-kube-api-access-448ng" (OuterVolumeSpecName: "kube-api-access-448ng") pod "b83dff2f-801e-4a9b-9427-48e1f51bcc79" (UID: "b83dff2f-801e-4a9b-9427-48e1f51bcc79"). InnerVolumeSpecName "kube-api-access-448ng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.081784 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b83dff2f-801e-4a9b-9427-48e1f51bcc79" (UID: "b83dff2f-801e-4a9b-9427-48e1f51bcc79"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.098474 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b83dff2f-801e-4a9b-9427-48e1f51bcc79" (UID: "b83dff2f-801e-4a9b-9427-48e1f51bcc79"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.101643 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b83dff2f-801e-4a9b-9427-48e1f51bcc79-scripts" (OuterVolumeSpecName: "scripts") pod "b83dff2f-801e-4a9b-9427-48e1f51bcc79" (UID: "b83dff2f-801e-4a9b-9427-48e1f51bcc79"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.105239 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b83dff2f-801e-4a9b-9427-48e1f51bcc79" (UID: "b83dff2f-801e-4a9b-9427-48e1f51bcc79"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.171588 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-448ng\" (UniqueName: \"kubernetes.io/projected/b83dff2f-801e-4a9b-9427-48e1f51bcc79-kube-api-access-448ng\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.171642 4807 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b83dff2f-801e-4a9b-9427-48e1f51bcc79-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.171655 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.171663 4807 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.171672 4807 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b83dff2f-801e-4a9b-9427-48e1f51bcc79-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.171681 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b83dff2f-801e-4a9b-9427-48e1f51bcc79-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.171690 4807 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b83dff2f-801e-4a9b-9427-48e1f51bcc79-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.578621 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.583715 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc29fb6b-2886-4d51-8afd-be8fc1069ee4-etc-swift\") pod \"swift-storage-0\" (UID: \"bc29fb6b-2886-4d51-8afd-be8fc1069ee4\") " pod="openstack/swift-storage-0" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.604759 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6hg29" event={"ID":"b83dff2f-801e-4a9b-9427-48e1f51bcc79","Type":"ContainerDied","Data":"0ed382d648f2dfe7500fa89fddca6caeb0bfba3032ff9f176a991ff5ee1e3654"} Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.604988 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ed382d648f2dfe7500fa89fddca6caeb0bfba3032ff9f176a991ff5ee1e3654" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.605064 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6hg29" Nov 27 11:25:55 crc kubenswrapper[4807]: I1127 11:25:55.794769 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Nov 27 11:25:56 crc kubenswrapper[4807]: I1127 11:25:56.307427 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 27 11:25:56 crc kubenswrapper[4807]: W1127 11:25:56.318600 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc29fb6b_2886_4d51_8afd_be8fc1069ee4.slice/crio-9c9bb044955166eda7b93970ea3aa51133a995c305ae0bd915213715b7ed606a WatchSource:0}: Error finding container 9c9bb044955166eda7b93970ea3aa51133a995c305ae0bd915213715b7ed606a: Status 404 returned error can't find the container with id 9c9bb044955166eda7b93970ea3aa51133a995c305ae0bd915213715b7ed606a Nov 27 11:25:56 crc kubenswrapper[4807]: I1127 11:25:56.615546 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"9c9bb044955166eda7b93970ea3aa51133a995c305ae0bd915213715b7ed606a"} Nov 27 11:25:56 crc kubenswrapper[4807]: I1127 11:25:56.966383 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-67wm7"] Nov 27 11:25:56 crc kubenswrapper[4807]: E1127 11:25:56.966723 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b83dff2f-801e-4a9b-9427-48e1f51bcc79" containerName="swift-ring-rebalance" Nov 27 11:25:56 crc kubenswrapper[4807]: I1127 11:25:56.966740 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b83dff2f-801e-4a9b-9427-48e1f51bcc79" containerName="swift-ring-rebalance" Nov 27 11:25:56 crc kubenswrapper[4807]: I1127 11:25:56.966937 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b83dff2f-801e-4a9b-9427-48e1f51bcc79" containerName="swift-ring-rebalance" Nov 27 11:25:56 crc kubenswrapper[4807]: I1127 11:25:56.967583 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-67wm7" Nov 27 11:25:56 crc kubenswrapper[4807]: I1127 11:25:56.970408 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 27 11:25:56 crc kubenswrapper[4807]: I1127 11:25:56.970450 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-msqwq" Nov 27 11:25:56 crc kubenswrapper[4807]: I1127 11:25:56.970461 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 27 11:25:56 crc kubenswrapper[4807]: I1127 11:25:56.972873 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 27 11:25:56 crc kubenswrapper[4807]: I1127 11:25:56.986969 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-67wm7"] Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.113287 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-config-data\") pod \"keystone-db-sync-67wm7\" (UID: \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\") " pod="openstack/keystone-db-sync-67wm7" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.113358 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-combined-ca-bundle\") pod \"keystone-db-sync-67wm7\" (UID: \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\") " pod="openstack/keystone-db-sync-67wm7" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.113383 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dz7x\" (UniqueName: \"kubernetes.io/projected/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-kube-api-access-4dz7x\") pod \"keystone-db-sync-67wm7\" (UID: 
\"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\") " pod="openstack/keystone-db-sync-67wm7" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.214933 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-combined-ca-bundle\") pod \"keystone-db-sync-67wm7\" (UID: \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\") " pod="openstack/keystone-db-sync-67wm7" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.214981 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dz7x\" (UniqueName: \"kubernetes.io/projected/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-kube-api-access-4dz7x\") pod \"keystone-db-sync-67wm7\" (UID: \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\") " pod="openstack/keystone-db-sync-67wm7" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.215096 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-config-data\") pod \"keystone-db-sync-67wm7\" (UID: \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\") " pod="openstack/keystone-db-sync-67wm7" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.222630 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-combined-ca-bundle\") pod \"keystone-db-sync-67wm7\" (UID: \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\") " pod="openstack/keystone-db-sync-67wm7" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.223811 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-config-data\") pod \"keystone-db-sync-67wm7\" (UID: \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\") " pod="openstack/keystone-db-sync-67wm7" Nov 27 11:25:57 crc kubenswrapper[4807]: 
I1127 11:25:57.231352 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dz7x\" (UniqueName: \"kubernetes.io/projected/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-kube-api-access-4dz7x\") pod \"keystone-db-sync-67wm7\" (UID: \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\") " pod="openstack/keystone-db-sync-67wm7" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.287549 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-67wm7" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.561164 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.599021 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-26rzj" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.811728 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-64nw4-config-mjsls"] Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.814447 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.831376 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-64nw4-config-mjsls"] Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.861070 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.935150 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-additional-scripts\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.935208 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q77pn\" (UniqueName: \"kubernetes.io/projected/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-kube-api-access-q77pn\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.935236 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-scripts\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.935295 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-run\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: 
\"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.935373 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-run-ovn\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.935409 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-log-ovn\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:57 crc kubenswrapper[4807]: I1127 11:25:57.981238 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-67wm7"] Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.036770 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-additional-scripts\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.036836 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q77pn\" (UniqueName: \"kubernetes.io/projected/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-kube-api-access-q77pn\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.036861 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-scripts\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.036898 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-run\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.036985 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-run-ovn\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.037026 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-log-ovn\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.037301 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-log-ovn\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.037309 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-run\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.037438 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-run-ovn\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.038992 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-scripts\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.039474 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-additional-scripts\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.057139 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q77pn\" (UniqueName: \"kubernetes.io/projected/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-kube-api-access-q77pn\") pod \"ovn-controller-64nw4-config-mjsls\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.185072 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.630543 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"031764d1330f1f462cd71f3fdac16422ea9536d452ec0237c15d6b3e45cd5310"} Nov 27 11:25:58 crc kubenswrapper[4807]: I1127 11:25:58.632765 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-67wm7" event={"ID":"ca7e9a2b-6e64-4497-8003-8c9aaaf37806","Type":"ContainerStarted","Data":"1ad57670c2abcff6f87b132d3814973f1996d4f53c5b36262cc9a4630b637c92"} Nov 27 11:25:59 crc kubenswrapper[4807]: I1127 11:25:59.490323 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Nov 27 11:26:04 crc kubenswrapper[4807]: I1127 11:26:04.639631 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-64nw4-config-mjsls"] Nov 27 11:26:04 crc kubenswrapper[4807]: W1127 11:26:04.650613 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b4e04f2_be0b_48c2_bac4_ec5d668f9378.slice/crio-d7df1da3b5cddbd46f8b2967d351cefe642b9ddec82c51b4ce95cafd8132f9b1 WatchSource:0}: Error finding container d7df1da3b5cddbd46f8b2967d351cefe642b9ddec82c51b4ce95cafd8132f9b1: Status 404 returned error can't find the container with id d7df1da3b5cddbd46f8b2967d351cefe642b9ddec82c51b4ce95cafd8132f9b1 Nov 27 11:26:04 crc kubenswrapper[4807]: I1127 11:26:04.698850 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-64nw4-config-mjsls" event={"ID":"3b4e04f2-be0b-48c2-bac4-ec5d668f9378","Type":"ContainerStarted","Data":"d7df1da3b5cddbd46f8b2967d351cefe642b9ddec82c51b4ce95cafd8132f9b1"} Nov 27 11:26:04 crc kubenswrapper[4807]: I1127 11:26:04.701426 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"f853527e0687f8d5f618d6b662ef4e8200bd1527e72116effdfc7d0f14991473"} Nov 27 11:26:04 crc kubenswrapper[4807]: I1127 11:26:04.701449 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"588220b020d0e422ae666594c890368bf6127f2433dabdff680e4cec5d7e8e70"} Nov 27 11:26:04 crc kubenswrapper[4807]: I1127 11:26:04.701458 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"150cffc6476a4f32f09bc7e8fc8e355150546f56e74f178796a01ad9a19dd12a"} Nov 27 11:26:05 crc kubenswrapper[4807]: I1127 11:26:05.713113 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnw86" event={"ID":"123dc36c-92d1-4c98-9c02-ed2c4fbbff27","Type":"ContainerStarted","Data":"76882f63bebb00e995acda55fe9e326376a9adaf2bb13424aafc6775bee1da48"} Nov 27 11:26:05 crc kubenswrapper[4807]: I1127 11:26:05.715383 4807 generic.go:334] "Generic (PLEG): container finished" podID="3b4e04f2-be0b-48c2-bac4-ec5d668f9378" containerID="5f0391f8213a2b3e2666161297485f2946f5f082a36ee68c584d63c5c389296b" exitCode=0 Nov 27 11:26:05 crc kubenswrapper[4807]: I1127 11:26:05.715492 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-64nw4-config-mjsls" event={"ID":"3b4e04f2-be0b-48c2-bac4-ec5d668f9378","Type":"ContainerDied","Data":"5f0391f8213a2b3e2666161297485f2946f5f082a36ee68c584d63c5c389296b"} Nov 27 11:26:05 crc kubenswrapper[4807]: I1127 11:26:05.728275 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-xnw86" podStartSLOduration=3.142172944 podStartE2EDuration="14.728240296s" podCreationTimestamp="2025-11-27 11:25:51 +0000 UTC" firstStartedPulling="2025-11-27 
11:25:52.764224552 +0000 UTC m=+993.863722740" lastFinishedPulling="2025-11-27 11:26:04.350291894 +0000 UTC m=+1005.449790092" observedRunningTime="2025-11-27 11:26:05.726230373 +0000 UTC m=+1006.825728591" watchObservedRunningTime="2025-11-27 11:26:05.728240296 +0000 UTC m=+1006.827738484" Nov 27 11:26:07 crc kubenswrapper[4807]: I1127 11:26:07.587747 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-64nw4" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.239904 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.405575 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-additional-scripts\") pod \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.405744 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-log-ovn\") pod \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.405781 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q77pn\" (UniqueName: \"kubernetes.io/projected/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-kube-api-access-q77pn\") pod \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.405822 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-scripts\") pod 
\"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.405884 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-run-ovn\") pod \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.405926 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-run\") pod \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\" (UID: \"3b4e04f2-be0b-48c2-bac4-ec5d668f9378\") " Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.406272 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3b4e04f2-be0b-48c2-bac4-ec5d668f9378" (UID: "3b4e04f2-be0b-48c2-bac4-ec5d668f9378"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.406312 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-run" (OuterVolumeSpecName: "var-run") pod "3b4e04f2-be0b-48c2-bac4-ec5d668f9378" (UID: "3b4e04f2-be0b-48c2-bac4-ec5d668f9378"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.406303 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3b4e04f2-be0b-48c2-bac4-ec5d668f9378" (UID: "3b4e04f2-be0b-48c2-bac4-ec5d668f9378"). 
InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.406635 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3b4e04f2-be0b-48c2-bac4-ec5d668f9378" (UID: "3b4e04f2-be0b-48c2-bac4-ec5d668f9378"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.406926 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-scripts" (OuterVolumeSpecName: "scripts") pod "3b4e04f2-be0b-48c2-bac4-ec5d668f9378" (UID: "3b4e04f2-be0b-48c2-bac4-ec5d668f9378"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.407294 4807 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.407312 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.407322 4807 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.407330 4807 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-var-run\") on node \"crc\" DevicePath \"\"" 
Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.407339 4807 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.410566 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-kube-api-access-q77pn" (OuterVolumeSpecName: "kube-api-access-q77pn") pod "3b4e04f2-be0b-48c2-bac4-ec5d668f9378" (UID: "3b4e04f2-be0b-48c2-bac4-ec5d668f9378"). InnerVolumeSpecName "kube-api-access-q77pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.509438 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q77pn\" (UniqueName: \"kubernetes.io/projected/3b4e04f2-be0b-48c2-bac4-ec5d668f9378-kube-api-access-q77pn\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.744476 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"06197ebc19295f840bf7a08b074a8170fbb5d18f70efa5451a6d16393063e7bc"} Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.744560 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"89e3c96c7c11b54a4544825b1042a0259ed1f71f0396531d19e65ed43096db84"} Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.744570 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"619fb7eca6cff5fd79a608b6eaa85b8a90bf51b1dbf777e8c2f1ce6dd0304d31"} Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.747157 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-67wm7" event={"ID":"ca7e9a2b-6e64-4497-8003-8c9aaaf37806","Type":"ContainerStarted","Data":"a0179e71faadcd429ddb3d2c5def194609426f5702bf4b4104c8b140249f676a"} Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.749022 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-64nw4-config-mjsls" event={"ID":"3b4e04f2-be0b-48c2-bac4-ec5d668f9378","Type":"ContainerDied","Data":"d7df1da3b5cddbd46f8b2967d351cefe642b9ddec82c51b4ce95cafd8132f9b1"} Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.749047 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7df1da3b5cddbd46f8b2967d351cefe642b9ddec82c51b4ce95cafd8132f9b1" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.749136 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-64nw4-config-mjsls" Nov 27 11:26:08 crc kubenswrapper[4807]: I1127 11:26:08.769381 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-67wm7" podStartSLOduration=2.677617746 podStartE2EDuration="12.76936507s" podCreationTimestamp="2025-11-27 11:25:56 +0000 UTC" firstStartedPulling="2025-11-27 11:25:57.978785229 +0000 UTC m=+999.078283427" lastFinishedPulling="2025-11-27 11:26:08.070532553 +0000 UTC m=+1009.170030751" observedRunningTime="2025-11-27 11:26:08.764021218 +0000 UTC m=+1009.863519416" watchObservedRunningTime="2025-11-27 11:26:08.76936507 +0000 UTC m=+1009.868863268" Nov 27 11:26:09 crc kubenswrapper[4807]: I1127 11:26:09.366820 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-64nw4-config-mjsls"] Nov 27 11:26:09 crc kubenswrapper[4807]: I1127 11:26:09.373454 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-64nw4-config-mjsls"] Nov 27 11:26:09 crc kubenswrapper[4807]: I1127 11:26:09.541721 4807 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b4e04f2-be0b-48c2-bac4-ec5d668f9378" path="/var/lib/kubelet/pods/3b4e04f2-be0b-48c2-bac4-ec5d668f9378/volumes" Nov 27 11:26:09 crc kubenswrapper[4807]: I1127 11:26:09.760891 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"947ff50c99dd88da63f760b3df6986addb50db4812ec5f58960b22de5ebec325"} Nov 27 11:26:14 crc kubenswrapper[4807]: I1127 11:26:14.831970 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"ad0c58323d6d90bce61605f50547289ec537b74aee8191a1c2462d03c2c4830b"} Nov 27 11:26:15 crc kubenswrapper[4807]: I1127 11:26:15.856141 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"8b8f238bc9a96d6c9ab59507bd51ef1c026fa6a5961ff7f15b2d7afcb0e274b6"} Nov 27 11:26:15 crc kubenswrapper[4807]: I1127 11:26:15.856188 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"83a5aa3c1ccbca818255499eb6ef4759f9f356d060ebe98e5669989b6d6158b6"} Nov 27 11:26:15 crc kubenswrapper[4807]: I1127 11:26:15.856200 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"30e3f16a84fb43d8db5daf72bc7569977b5e89b113951cb6376c0f32483c245a"} Nov 27 11:26:15 crc kubenswrapper[4807]: I1127 11:26:15.856208 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"7b24b3a9a45e8553a0e8f266d5e306f7a87a2e9e84fa8d7549ee1b26323a180d"} Nov 
27 11:26:15 crc kubenswrapper[4807]: I1127 11:26:15.856217 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"a5113ce1a77d8c197268942a815d9c7bdff0e61084ffb030a7ce2aa35798b565"} Nov 27 11:26:16 crc kubenswrapper[4807]: I1127 11:26:16.886601 4807 generic.go:334] "Generic (PLEG): container finished" podID="ca7e9a2b-6e64-4497-8003-8c9aaaf37806" containerID="a0179e71faadcd429ddb3d2c5def194609426f5702bf4b4104c8b140249f676a" exitCode=0 Nov 27 11:26:16 crc kubenswrapper[4807]: I1127 11:26:16.886688 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-67wm7" event={"ID":"ca7e9a2b-6e64-4497-8003-8c9aaaf37806","Type":"ContainerDied","Data":"a0179e71faadcd429ddb3d2c5def194609426f5702bf4b4104c8b140249f676a"} Nov 27 11:26:16 crc kubenswrapper[4807]: I1127 11:26:16.889792 4807 generic.go:334] "Generic (PLEG): container finished" podID="123dc36c-92d1-4c98-9c02-ed2c4fbbff27" containerID="76882f63bebb00e995acda55fe9e326376a9adaf2bb13424aafc6775bee1da48" exitCode=0 Nov 27 11:26:16 crc kubenswrapper[4807]: I1127 11:26:16.889895 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnw86" event={"ID":"123dc36c-92d1-4c98-9c02-ed2c4fbbff27","Type":"ContainerDied","Data":"76882f63bebb00e995acda55fe9e326376a9adaf2bb13424aafc6775bee1da48"} Nov 27 11:26:16 crc kubenswrapper[4807]: I1127 11:26:16.911057 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"bc29fb6b-2886-4d51-8afd-be8fc1069ee4","Type":"ContainerStarted","Data":"9de5772530f62bd34cfd64cfe9345cca9d0ddc7359d026af186d9c5b38eae6b8"} Nov 27 11:26:16 crc kubenswrapper[4807]: I1127 11:26:16.986967 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.64371712 podStartE2EDuration="38.986941933s" podCreationTimestamp="2025-11-27 11:25:38 
+0000 UTC" firstStartedPulling="2025-11-27 11:25:56.321839531 +0000 UTC m=+997.421337719" lastFinishedPulling="2025-11-27 11:26:14.665064334 +0000 UTC m=+1015.764562532" observedRunningTime="2025-11-27 11:26:16.965316111 +0000 UTC m=+1018.064814309" watchObservedRunningTime="2025-11-27 11:26:16.986941933 +0000 UTC m=+1018.086440131" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.251899 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-xzmxr"] Nov 27 11:26:17 crc kubenswrapper[4807]: E1127 11:26:17.252490 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4e04f2-be0b-48c2-bac4-ec5d668f9378" containerName="ovn-config" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.252582 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4e04f2-be0b-48c2-bac4-ec5d668f9378" containerName="ovn-config" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.252830 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b4e04f2-be0b-48c2-bac4-ec5d668f9378" containerName="ovn-config" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.253777 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.256593 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.262141 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-xzmxr"] Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.357767 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.358102 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.358131 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-config\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.358173 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsck7\" (UniqueName: \"kubernetes.io/projected/d8e8994e-3b30-4187-b13a-21258a2e8c25-kube-api-access-wsck7\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " 
pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.358390 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.358471 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-dns-svc\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.460017 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-dns-svc\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.460106 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.460174 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-config\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 
27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.460194 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.460236 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsck7\" (UniqueName: \"kubernetes.io/projected/d8e8994e-3b30-4187-b13a-21258a2e8c25-kube-api-access-wsck7\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.460333 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.461034 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.461045 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-dns-svc\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.461342 
4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.461346 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.461419 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-config\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.498234 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsck7\" (UniqueName: \"kubernetes.io/projected/d8e8994e-3b30-4187-b13a-21258a2e8c25-kube-api-access-wsck7\") pod \"dnsmasq-dns-764c5664d7-xzmxr\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:17 crc kubenswrapper[4807]: I1127 11:26:17.576799 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.005508 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-xzmxr"] Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.198857 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-67wm7" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.305052 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xnw86" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.376491 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-combined-ca-bundle\") pod \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\" (UID: \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\") " Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.376602 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dz7x\" (UniqueName: \"kubernetes.io/projected/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-kube-api-access-4dz7x\") pod \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\" (UID: \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\") " Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.376641 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-config-data\") pod \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\" (UID: \"ca7e9a2b-6e64-4497-8003-8c9aaaf37806\") " Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.380229 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-kube-api-access-4dz7x" (OuterVolumeSpecName: "kube-api-access-4dz7x") pod "ca7e9a2b-6e64-4497-8003-8c9aaaf37806" (UID: "ca7e9a2b-6e64-4497-8003-8c9aaaf37806"). InnerVolumeSpecName "kube-api-access-4dz7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.402716 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca7e9a2b-6e64-4497-8003-8c9aaaf37806" (UID: "ca7e9a2b-6e64-4497-8003-8c9aaaf37806"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.420020 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-config-data" (OuterVolumeSpecName: "config-data") pod "ca7e9a2b-6e64-4497-8003-8c9aaaf37806" (UID: "ca7e9a2b-6e64-4497-8003-8c9aaaf37806"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.477857 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-db-sync-config-data\") pod \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.478015 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-combined-ca-bundle\") pod \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.478375 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-config-data\") pod \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " Nov 
27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.478406 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9znw8\" (UniqueName: \"kubernetes.io/projected/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-kube-api-access-9znw8\") pod \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\" (UID: \"123dc36c-92d1-4c98-9c02-ed2c4fbbff27\") " Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.478977 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.478996 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dz7x\" (UniqueName: \"kubernetes.io/projected/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-kube-api-access-4dz7x\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.479025 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca7e9a2b-6e64-4497-8003-8c9aaaf37806-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.481154 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "123dc36c-92d1-4c98-9c02-ed2c4fbbff27" (UID: "123dc36c-92d1-4c98-9c02-ed2c4fbbff27"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.483419 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-kube-api-access-9znw8" (OuterVolumeSpecName: "kube-api-access-9znw8") pod "123dc36c-92d1-4c98-9c02-ed2c4fbbff27" (UID: "123dc36c-92d1-4c98-9c02-ed2c4fbbff27"). 
InnerVolumeSpecName "kube-api-access-9znw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.504698 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "123dc36c-92d1-4c98-9c02-ed2c4fbbff27" (UID: "123dc36c-92d1-4c98-9c02-ed2c4fbbff27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.529391 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-config-data" (OuterVolumeSpecName: "config-data") pod "123dc36c-92d1-4c98-9c02-ed2c4fbbff27" (UID: "123dc36c-92d1-4c98-9c02-ed2c4fbbff27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.580640 4807 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.580675 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.580689 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.580702 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9znw8\" (UniqueName: 
\"kubernetes.io/projected/123dc36c-92d1-4c98-9c02-ed2c4fbbff27-kube-api-access-9znw8\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.931179 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-67wm7" event={"ID":"ca7e9a2b-6e64-4497-8003-8c9aaaf37806","Type":"ContainerDied","Data":"1ad57670c2abcff6f87b132d3814973f1996d4f53c5b36262cc9a4630b637c92"} Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.931217 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ad57670c2abcff6f87b132d3814973f1996d4f53c5b36262cc9a4630b637c92" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.931289 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-67wm7" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.934191 4807 generic.go:334] "Generic (PLEG): container finished" podID="d8e8994e-3b30-4187-b13a-21258a2e8c25" containerID="ebb3b7f06d304d2ccd62bfc4a5cf07f9b5d2a2944cba3ba7a342adb0e12d6b5c" exitCode=0 Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.934374 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" event={"ID":"d8e8994e-3b30-4187-b13a-21258a2e8c25","Type":"ContainerDied","Data":"ebb3b7f06d304d2ccd62bfc4a5cf07f9b5d2a2944cba3ba7a342adb0e12d6b5c"} Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.934420 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" event={"ID":"d8e8994e-3b30-4187-b13a-21258a2e8c25","Type":"ContainerStarted","Data":"c4eae0883126e6f02cd5b55cb992f4a9445890e4819894fb39d40c05af9d96eb"} Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.941037 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xnw86" 
event={"ID":"123dc36c-92d1-4c98-9c02-ed2c4fbbff27","Type":"ContainerDied","Data":"cd451455087df42ae52978fea10347b613b445747dc6b32c9be8076dcd081932"} Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.941076 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd451455087df42ae52978fea10347b613b445747dc6b32c9be8076dcd081932" Nov 27 11:26:18 crc kubenswrapper[4807]: I1127 11:26:18.941130 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xnw86" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.168215 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-xzmxr"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.192319 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fg767"] Nov 27 11:26:19 crc kubenswrapper[4807]: E1127 11:26:19.192627 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="123dc36c-92d1-4c98-9c02-ed2c4fbbff27" containerName="glance-db-sync" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.192638 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="123dc36c-92d1-4c98-9c02-ed2c4fbbff27" containerName="glance-db-sync" Nov 27 11:26:19 crc kubenswrapper[4807]: E1127 11:26:19.192683 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca7e9a2b-6e64-4497-8003-8c9aaaf37806" containerName="keystone-db-sync" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.192691 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca7e9a2b-6e64-4497-8003-8c9aaaf37806" containerName="keystone-db-sync" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.192876 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="123dc36c-92d1-4c98-9c02-ed2c4fbbff27" containerName="glance-db-sync" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.192894 4807 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ca7e9a2b-6e64-4497-8003-8c9aaaf37806" containerName="keystone-db-sync" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.193411 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.198487 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-g5j4r"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.199832 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.200068 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.200568 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.200631 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.200765 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-msqwq" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.201044 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.229293 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fg767"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.279625 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-g5j4r"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.293821 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-combined-ca-bundle\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.293862 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwbq8\" (UniqueName: \"kubernetes.io/projected/48274d60-034d-4718-8126-594ffefd281e-kube-api-access-bwbq8\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.293890 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-fernet-keys\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.293911 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-config-data\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.293953 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-scripts\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.294012 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-credential-keys\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.406825 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f6577b49-lkc4r"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.412769 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-credential-keys\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.412924 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-config\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.412972 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-combined-ca-bundle\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.412992 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwbq8\" (UniqueName: \"kubernetes.io/projected/48274d60-034d-4718-8126-594ffefd281e-kube-api-access-bwbq8\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.413076 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-fernet-keys\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.413152 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-config-data\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.413448 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-dns-svc\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.414101 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.429390 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-fernet-keys\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.436187 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-combined-ca-bundle\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.437666 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-config-data\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.442718 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-credential-keys\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.448317 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-scripts\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.462834 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.463164 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp4qf\" (UniqueName: 
\"kubernetes.io/projected/fa487e14-332e-40e2-84b1-aeee891cc296-kube-api-access-rp4qf\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.463303 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.474410 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f6577b49-lkc4r"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.474525 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.476492 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.476541 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwbq8\" (UniqueName: \"kubernetes.io/projected/48274d60-034d-4718-8126-594ffefd281e-kube-api-access-bwbq8\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.477032 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.477092 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-hmgjn" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.477271 4807 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"horizon-config-data" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.477924 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-scripts\") pod \"keystone-bootstrap-fg767\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.529639 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-5w7sj"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.531614 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-5w7sj" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.555801 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-kc8dv" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.556083 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.556278 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.556425 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-msqwq" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.557666 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.570379 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n99f6\" (UniqueName: \"kubernetes.io/projected/c31ec5c2-fceb-48fd-a060-88185742b123-kube-api-access-n99f6\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.570437 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-dns-svc\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.570470 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.570506 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.570537 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp4qf\" (UniqueName: \"kubernetes.io/projected/fa487e14-332e-40e2-84b1-aeee891cc296-kube-api-access-rp4qf\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " 
pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.570598 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c31ec5c2-fceb-48fd-a060-88185742b123-scripts\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.570616 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.570634 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c31ec5c2-fceb-48fd-a060-88185742b123-config-data\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.570667 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c31ec5c2-fceb-48fd-a060-88185742b123-horizon-secret-key\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.570706 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-config\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 
11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.570735 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c31ec5c2-fceb-48fd-a060-88185742b123-logs\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.576515 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.577944 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-dns-svc\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.602122 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-5w7sj"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.602158 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-whdsr"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.603172 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.619267 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.622388 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.622503 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.622899 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-config\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.619556 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rmb2j" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.632460 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-whdsr"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.638472 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.654995 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rp4qf\" (UniqueName: \"kubernetes.io/projected/fa487e14-332e-40e2-84b1-aeee891cc296-kube-api-access-rp4qf\") pod \"dnsmasq-dns-5959f8865f-g5j4r\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.681474 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.683548 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.687121 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.687490 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.696300 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.702371 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n99f6\" (UniqueName: \"kubernetes.io/projected/c31ec5c2-fceb-48fd-a060-88185742b123-kube-api-access-n99f6\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.702454 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtldw\" (UniqueName: \"kubernetes.io/projected/1b745997-2256-496c-acee-f804c263ec35-kube-api-access-rtldw\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.702498 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-scripts\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.702584 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-db-sync-config-data\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.702614 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b745997-2256-496c-acee-f804c263ec35-etc-machine-id\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.702672 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda15506-8b54-4097-ba8e-55dabab3ede7-combined-ca-bundle\") pod \"neutron-db-sync-5w7sj\" (UID: \"fda15506-8b54-4097-ba8e-55dabab3ede7\") " pod="openstack/neutron-db-sync-5w7sj" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.702696 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fda15506-8b54-4097-ba8e-55dabab3ede7-config\") pod \"neutron-db-sync-5w7sj\" (UID: \"fda15506-8b54-4097-ba8e-55dabab3ede7\") " pod="openstack/neutron-db-sync-5w7sj" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.702736 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-combined-ca-bundle\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.702799 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c31ec5c2-fceb-48fd-a060-88185742b123-scripts\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.702823 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c31ec5c2-fceb-48fd-a060-88185742b123-config-data\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.702880 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-config-data\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.702898 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c31ec5c2-fceb-48fd-a060-88185742b123-horizon-secret-key\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.702969 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c31ec5c2-fceb-48fd-a060-88185742b123-logs\") pod \"horizon-f6577b49-lkc4r\" (UID: 
\"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.703019 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7xbf\" (UniqueName: \"kubernetes.io/projected/fda15506-8b54-4097-ba8e-55dabab3ede7-kube-api-access-s7xbf\") pod \"neutron-db-sync-5w7sj\" (UID: \"fda15506-8b54-4097-ba8e-55dabab3ede7\") " pod="openstack/neutron-db-sync-5w7sj" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.704514 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c31ec5c2-fceb-48fd-a060-88185742b123-scripts\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.705324 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c31ec5c2-fceb-48fd-a060-88185742b123-config-data\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.705687 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c31ec5c2-fceb-48fd-a060-88185742b123-logs\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.709465 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.730768 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-g5j4r"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.731404 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.738292 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c31ec5c2-fceb-48fd-a060-88185742b123-horizon-secret-key\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.766477 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-rbw47"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.767081 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n99f6\" (UniqueName: \"kubernetes.io/projected/c31ec5c2-fceb-48fd-a060-88185742b123-kube-api-access-n99f6\") pod \"horizon-f6577b49-lkc4r\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.767499 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-rbw47" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.773651 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.774350 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rbw47"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.778537 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-867f8" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.804660 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-scripts\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.804890 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7xbf\" (UniqueName: \"kubernetes.io/projected/fda15506-8b54-4097-ba8e-55dabab3ede7-kube-api-access-s7xbf\") pod \"neutron-db-sync-5w7sj\" (UID: \"fda15506-8b54-4097-ba8e-55dabab3ede7\") " pod="openstack/neutron-db-sync-5w7sj" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.804920 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtldw\" (UniqueName: \"kubernetes.io/projected/1b745997-2256-496c-acee-f804c263ec35-kube-api-access-rtldw\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.804941 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-scripts\") pod \"cinder-db-sync-whdsr\" (UID: 
\"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.804978 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-db-sync-config-data\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.805000 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b745997-2256-496c-acee-f804c263ec35-etc-machine-id\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.805029 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda15506-8b54-4097-ba8e-55dabab3ede7-combined-ca-bundle\") pod \"neutron-db-sync-5w7sj\" (UID: \"fda15506-8b54-4097-ba8e-55dabab3ede7\") " pod="openstack/neutron-db-sync-5w7sj" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.805044 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fda15506-8b54-4097-ba8e-55dabab3ede7-config\") pod \"neutron-db-sync-5w7sj\" (UID: \"fda15506-8b54-4097-ba8e-55dabab3ede7\") " pod="openstack/neutron-db-sync-5w7sj" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.805068 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-combined-ca-bundle\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 
11:26:19.805096 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjf4f\" (UniqueName: \"kubernetes.io/projected/7a3d55df-3e92-4cb5-aedd-7589b72d5471-kube-api-access-rjf4f\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.805111 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-config-data\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.805129 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3d55df-3e92-4cb5-aedd-7589b72d5471-log-httpd\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.805143 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3d55df-3e92-4cb5-aedd-7589b72d5471-run-httpd\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.805162 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-config-data\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.805182 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.805203 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.811143 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b745997-2256-496c-acee-f804c263ec35-etc-machine-id\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.814803 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-combined-ca-bundle\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.816188 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.817847 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.822164 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-scripts\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.822496 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-config-data\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.832813 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.839385 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda15506-8b54-4097-ba8e-55dabab3ede7-combined-ca-bundle\") pod \"neutron-db-sync-5w7sj\" (UID: \"fda15506-8b54-4097-ba8e-55dabab3ede7\") " pod="openstack/neutron-db-sync-5w7sj" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.842654 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-db-sync-config-data\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.844239 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fda15506-8b54-4097-ba8e-55dabab3ede7-config\") pod \"neutron-db-sync-5w7sj\" (UID: \"fda15506-8b54-4097-ba8e-55dabab3ede7\") " 
pod="openstack/neutron-db-sync-5w7sj" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.851805 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.852843 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b87v8" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.858139 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-rxznz"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.862737 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.865894 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtldw\" (UniqueName: \"kubernetes.io/projected/1b745997-2256-496c-acee-f804c263ec35-kube-api-access-rtldw\") pod \"cinder-db-sync-whdsr\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.870227 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7xbf\" (UniqueName: \"kubernetes.io/projected/fda15506-8b54-4097-ba8e-55dabab3ede7-kube-api-access-s7xbf\") pod \"neutron-db-sync-5w7sj\" (UID: \"fda15506-8b54-4097-ba8e-55dabab3ede7\") " pod="openstack/neutron-db-sync-5w7sj" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.883314 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-sgwqm"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.884455 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.886306 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-sgwqm"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.898393 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.899012 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-hmgjn" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.900006 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.904587 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906077 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43ee107e-ce97-4aac-9d93-e56844eb58f7-logs\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906119 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjf4f\" (UniqueName: \"kubernetes.io/projected/7a3d55df-3e92-4cb5-aedd-7589b72d5471-kube-api-access-rjf4f\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906138 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-config-data\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " 
pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906158 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3d55df-3e92-4cb5-aedd-7589b72d5471-log-httpd\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906171 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3d55df-3e92-4cb5-aedd-7589b72d5471-run-httpd\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906201 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906220 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906257 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-scripts\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906276 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906354 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906371 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906398 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr77d\" (UniqueName: \"kubernetes.io/projected/b2df3b54-f71f-469f-92e5-8c1daeb90a45-kube-api-access-xr77d\") pod \"barbican-db-sync-rbw47\" (UID: \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\") " pod="openstack/barbican-db-sync-rbw47" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906433 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2df3b54-f71f-469f-92e5-8c1daeb90a45-combined-ca-bundle\") pod \"barbican-db-sync-rbw47\" (UID: \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\") " pod="openstack/barbican-db-sync-rbw47" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906461 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906478 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b2df3b54-f71f-469f-92e5-8c1daeb90a45-db-sync-config-data\") pod \"barbican-db-sync-rbw47\" (UID: \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\") " pod="openstack/barbican-db-sync-rbw47" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906493 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43ee107e-ce97-4aac-9d93-e56844eb58f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.906533 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lh6f\" (UniqueName: \"kubernetes.io/projected/43ee107e-ce97-4aac-9d93-e56844eb58f7-kube-api-access-5lh6f\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.908233 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3d55df-3e92-4cb5-aedd-7589b72d5471-log-httpd\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.909152 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7a3d55df-3e92-4cb5-aedd-7589b72d5471-run-httpd\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.918853 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-rxznz"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.920757 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-lwbpk" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.928806 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.949570 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6db58799f7-fkmzb"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.957018 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.984890 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.985267 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-config-data\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.985759 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6db58799f7-fkmzb"] Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.986142 4807 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-scripts\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:19 crc kubenswrapper[4807]: I1127 11:26:19.994675 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.006156 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" event={"ID":"d8e8994e-3b30-4187-b13a-21258a2e8c25","Type":"ContainerStarted","Data":"4a53654bff9d44a0279c72b00799b40dacdc26ba6046d93ba15b510db32093e4"} Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.006382 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" podUID="d8e8994e-3b30-4187-b13a-21258a2e8c25" containerName="dnsmasq-dns" containerID="cri-o://4a53654bff9d44a0279c72b00799b40dacdc26ba6046d93ba15b510db32093e4" gracePeriod=10 Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.006500 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.018075 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjf4f\" (UniqueName: \"kubernetes.io/projected/7a3d55df-3e92-4cb5-aedd-7589b72d5471-kube-api-access-rjf4f\") pod \"ceilometer-0\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " pod="openstack/ceilometer-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.063574 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/da9ed693-7637-404d-8dd9-e849e11b4d43-config-data\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.070173 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-combined-ca-bundle\") pod \"placement-db-sync-sgwqm\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.066054 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-5w7sj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.071836 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f65bbc0-75a0-4294-9cf8-0023799a1fea-logs\") pod \"placement-db-sync-sgwqm\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.075009 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.075105 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.075210 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-config\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.075298 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/da9ed693-7637-404d-8dd9-e849e11b4d43-horizon-secret-key\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.075415 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr77d\" (UniqueName: \"kubernetes.io/projected/b2df3b54-f71f-469f-92e5-8c1daeb90a45-kube-api-access-xr77d\") pod \"barbican-db-sync-rbw47\" (UID: \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\") " pod="openstack/barbican-db-sync-rbw47" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.075509 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2df3b54-f71f-469f-92e5-8c1daeb90a45-combined-ca-bundle\") pod \"barbican-db-sync-rbw47\" (UID: \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\") " pod="openstack/barbican-db-sync-rbw47" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.075596 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.075668 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b2df3b54-f71f-469f-92e5-8c1daeb90a45-db-sync-config-data\") pod \"barbican-db-sync-rbw47\" (UID: \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\") " pod="openstack/barbican-db-sync-rbw47" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.075734 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43ee107e-ce97-4aac-9d93-e56844eb58f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.075805 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-dns-svc\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.075882 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6j5n\" (UniqueName: \"kubernetes.io/projected/0f65bbc0-75a0-4294-9cf8-0023799a1fea-kube-api-access-v6j5n\") pod \"placement-db-sync-sgwqm\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.075960 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da9ed693-7637-404d-8dd9-e849e11b4d43-logs\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.076045 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vjlh\" (UniqueName: \"kubernetes.io/projected/da9ed693-7637-404d-8dd9-e849e11b4d43-kube-api-access-6vjlh\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.076722 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-config-data\") pod \"placement-db-sync-sgwqm\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.076816 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lh6f\" (UniqueName: \"kubernetes.io/projected/43ee107e-ce97-4aac-9d93-e56844eb58f7-kube-api-access-5lh6f\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.076914 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.076992 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43ee107e-ce97-4aac-9d93-e56844eb58f7-logs\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.077153 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.077229 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-scripts\") pod \"placement-db-sync-sgwqm\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.077342 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.077438 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da9ed693-7637-404d-8dd9-e849e11b4d43-scripts\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.077516 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf5hb\" (UniqueName: \"kubernetes.io/projected/42abfcef-7f24-4754-ac86-b4014209e1ee-kube-api-access-jf5hb\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.077606 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.077982 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.094190 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43ee107e-ce97-4aac-9d93-e56844eb58f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.094861 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-prxjj"] Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.095458 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43ee107e-ce97-4aac-9d93-e56844eb58f7-logs\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.098030 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 
11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.125094 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b2df3b54-f71f-469f-92e5-8c1daeb90a45-db-sync-config-data\") pod \"barbican-db-sync-rbw47\" (UID: \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\") " pod="openstack/barbican-db-sync-rbw47" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.164765 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-whdsr" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.195029 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.195681 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.196957 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2df3b54-f71f-469f-92e5-8c1daeb90a45-combined-ca-bundle\") pod \"barbican-db-sync-rbw47\" (UID: \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\") " pod="openstack/barbican-db-sync-rbw47" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200372 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200404 4807 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-scripts\") pod \"placement-db-sync-sgwqm\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200432 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200467 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da9ed693-7637-404d-8dd9-e849e11b4d43-scripts\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200493 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf5hb\" (UniqueName: \"kubernetes.io/projected/42abfcef-7f24-4754-ac86-b4014209e1ee-kube-api-access-jf5hb\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200584 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da9ed693-7637-404d-8dd9-e849e11b4d43-config-data\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200607 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-combined-ca-bundle\") pod \"placement-db-sync-sgwqm\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200628 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f65bbc0-75a0-4294-9cf8-0023799a1fea-logs\") pod \"placement-db-sync-sgwqm\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200687 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-config\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200700 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/da9ed693-7637-404d-8dd9-e849e11b4d43-horizon-secret-key\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200788 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-dns-svc\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200807 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6j5n\" (UniqueName: \"kubernetes.io/projected/0f65bbc0-75a0-4294-9cf8-0023799a1fea-kube-api-access-v6j5n\") pod \"placement-db-sync-sgwqm\" 
(UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200828 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da9ed693-7637-404d-8dd9-e849e11b4d43-logs\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200858 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vjlh\" (UniqueName: \"kubernetes.io/projected/da9ed693-7637-404d-8dd9-e849e11b4d43-kube-api-access-6vjlh\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200886 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-config-data\") pod \"placement-db-sync-sgwqm\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.200929 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.201783 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " 
pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: E1127 11:26:20.202127 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-jf5hb ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-847c4cc679-rxznz" podUID="42abfcef-7f24-4754-ac86-b4014209e1ee" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.203112 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.205498 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-config\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.206103 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.206687 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da9ed693-7637-404d-8dd9-e849e11b4d43-scripts\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.206929 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lh6f\" (UniqueName: \"kubernetes.io/projected/43ee107e-ce97-4aac-9d93-e56844eb58f7-kube-api-access-5lh6f\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.214318 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/da9ed693-7637-404d-8dd9-e849e11b4d43-horizon-secret-key\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.215275 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-dns-svc\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.215333 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f65bbc0-75a0-4294-9cf8-0023799a1fea-logs\") pod \"placement-db-sync-sgwqm\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.215807 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da9ed693-7637-404d-8dd9-e849e11b4d43-logs\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.216012 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/da9ed693-7637-404d-8dd9-e849e11b4d43-config-data\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.217352 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.217625 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.223231 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-combined-ca-bundle\") pod \"placement-db-sync-sgwqm\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.226651 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-scripts\") pod \"placement-db-sync-sgwqm\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.227497 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-config-data\") pod \"placement-db-sync-sgwqm\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.228199 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr77d\" 
(UniqueName: \"kubernetes.io/projected/b2df3b54-f71f-469f-92e5-8c1daeb90a45-kube-api-access-xr77d\") pod \"barbican-db-sync-rbw47\" (UID: \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\") " pod="openstack/barbican-db-sync-rbw47" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.245064 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf5hb\" (UniqueName: \"kubernetes.io/projected/42abfcef-7f24-4754-ac86-b4014209e1ee-kube-api-access-jf5hb\") pod \"dnsmasq-dns-847c4cc679-rxznz\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.250289 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-rxznz"] Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.251227 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rbw47" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.252806 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vjlh\" (UniqueName: \"kubernetes.io/projected/da9ed693-7637-404d-8dd9-e849e11b4d43-kube-api-access-6vjlh\") pod \"horizon-6db58799f7-fkmzb\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.253852 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.274130 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6j5n\" (UniqueName: \"kubernetes.io/projected/0f65bbc0-75a0-4294-9cf8-0023799a1fea-kube-api-access-v6j5n\") pod \"placement-db-sync-sgwqm\" 
(UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.278695 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.305345 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.305517 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.305540 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.305578 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z766f\" (UniqueName: \"kubernetes.io/projected/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-kube-api-access-z766f\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.305628 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-config\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.305650 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.331742 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-prxjj"] Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.360582 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.399217 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" podStartSLOduration=3.399195409 podStartE2EDuration="3.399195409s" podCreationTimestamp="2025-11-27 11:26:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:20.254817712 +0000 UTC m=+1021.354315910" watchObservedRunningTime="2025-11-27 11:26:20.399195409 +0000 UTC m=+1021.498693607" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.407546 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-config\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " 
pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.407582 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.407663 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.407749 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.407775 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.407797 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z766f\" (UniqueName: \"kubernetes.io/projected/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-kube-api-access-z766f\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" 
Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.409027 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-config\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.415896 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.417382 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.417670 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.418059 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.435330 4807 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z766f\" (UniqueName: \"kubernetes.io/projected/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-kube-api-access-z766f\") pod \"dnsmasq-dns-785d8bcb8c-prxjj\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.471336 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fg767"] Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.541642 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.555751 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.603462 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.778123 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-g5j4r"] Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.827952 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.923934 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f6577b49-lkc4r"] Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.924035 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:26:20 crc kubenswrapper[4807]: I1127 11:26:20.927794 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.022028 4807 generic.go:334] "Generic (PLEG): container finished" podID="d8e8994e-3b30-4187-b13a-21258a2e8c25" containerID="4a53654bff9d44a0279c72b00799b40dacdc26ba6046d93ba15b510db32093e4" exitCode=0 Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.022143 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" event={"ID":"d8e8994e-3b30-4187-b13a-21258a2e8c25","Type":"ContainerDied","Data":"4a53654bff9d44a0279c72b00799b40dacdc26ba6046d93ba15b510db32093e4"} Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.022170 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" event={"ID":"d8e8994e-3b30-4187-b13a-21258a2e8c25","Type":"ContainerDied","Data":"c4eae0883126e6f02cd5b55cb992f4a9445890e4819894fb39d40c05af9d96eb"} Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.022187 4807 scope.go:117] "RemoveContainer" 
containerID="4a53654bff9d44a0279c72b00799b40dacdc26ba6046d93ba15b510db32093e4" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.022390 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-xzmxr" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.026024 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-dns-swift-storage-0\") pod \"d8e8994e-3b30-4187-b13a-21258a2e8c25\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.026066 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsck7\" (UniqueName: \"kubernetes.io/projected/d8e8994e-3b30-4187-b13a-21258a2e8c25-kube-api-access-wsck7\") pod \"d8e8994e-3b30-4187-b13a-21258a2e8c25\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.026145 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-ovsdbserver-nb\") pod \"d8e8994e-3b30-4187-b13a-21258a2e8c25\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.026194 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-dns-svc\") pod \"d8e8994e-3b30-4187-b13a-21258a2e8c25\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.026237 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-config\") pod \"d8e8994e-3b30-4187-b13a-21258a2e8c25\" (UID: 
\"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.026303 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-ovsdbserver-sb\") pod \"d8e8994e-3b30-4187-b13a-21258a2e8c25\" (UID: \"d8e8994e-3b30-4187-b13a-21258a2e8c25\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.037220 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e8994e-3b30-4187-b13a-21258a2e8c25-kube-api-access-wsck7" (OuterVolumeSpecName: "kube-api-access-wsck7") pod "d8e8994e-3b30-4187-b13a-21258a2e8c25" (UID: "d8e8994e-3b30-4187-b13a-21258a2e8c25"). InnerVolumeSpecName "kube-api-access-wsck7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.037348 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fg767" event={"ID":"48274d60-034d-4718-8126-594ffefd281e","Type":"ContainerStarted","Data":"e395946d5b7983decac1d895d83ca3d8f3b7b0b1ee7634702f1bac34d0480104"} Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.037383 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fg767" event={"ID":"48274d60-034d-4718-8126-594ffefd281e","Type":"ContainerStarted","Data":"b6ca89041c5518a3f8dfbe7d76baad5836600bf558e33bed4d5a2176e77bf638"} Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.040625 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f6577b49-lkc4r" event={"ID":"c31ec5c2-fceb-48fd-a060-88185742b123","Type":"ContainerStarted","Data":"4e9b707e872478a3e3de511b93fdf6e46872b19908a9809bbfc0f3cfcfad0c57"} Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.045538 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.045613 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" event={"ID":"fa487e14-332e-40e2-84b1-aeee891cc296","Type":"ContainerStarted","Data":"40b723f4fcbf4c86e69e855f2fbc54f87b017a57cc47f4136d4c3aea8b8e2174"} Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.045635 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" event={"ID":"fa487e14-332e-40e2-84b1-aeee891cc296","Type":"ContainerStarted","Data":"8f444d8ad393431c6b90b0c6fb3866dde70e730691f0729e5403cada0ff85c5c"} Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.112731 4807 scope.go:117] "RemoveContainer" containerID="ebb3b7f06d304d2ccd62bfc4a5cf07f9b5d2a2944cba3ba7a342adb0e12d6b5c" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.113140 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8e8994e-3b30-4187-b13a-21258a2e8c25" (UID: "d8e8994e-3b30-4187-b13a-21258a2e8c25"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.113758 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d8e8994e-3b30-4187-b13a-21258a2e8c25" (UID: "d8e8994e-3b30-4187-b13a-21258a2e8c25"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.114224 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8e8994e-3b30-4187-b13a-21258a2e8c25" (UID: "d8e8994e-3b30-4187-b13a-21258a2e8c25"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.115417 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fg767" podStartSLOduration=2.115397845 podStartE2EDuration="2.115397845s" podCreationTimestamp="2025-11-27 11:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:21.071166175 +0000 UTC m=+1022.170664373" watchObservedRunningTime="2025-11-27 11:26:21.115397845 +0000 UTC m=+1022.214896043" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.128949 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.128986 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsck7\" (UniqueName: \"kubernetes.io/projected/d8e8994e-3b30-4187-b13a-21258a2e8c25-kube-api-access-wsck7\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.129004 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.129018 4807 reconciler_common.go:293] "Volume 
detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.141848 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:21 crc kubenswrapper[4807]: W1127 11:26:21.142292 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfda15506_8b54_4097_ba8e_55dabab3ede7.slice/crio-9d457037c3d6f125f03b3616a9a2cc4906a98b06c88ad8b89d5064e10b6a9a30 WatchSource:0}: Error finding container 9d457037c3d6f125f03b3616a9a2cc4906a98b06c88ad8b89d5064e10b6a9a30: Status 404 returned error can't find the container with id 9d457037c3d6f125f03b3616a9a2cc4906a98b06c88ad8b89d5064e10b6a9a30 Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.149780 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8e8994e-3b30-4187-b13a-21258a2e8c25" (UID: "d8e8994e-3b30-4187-b13a-21258a2e8c25"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.171137 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-rbw47"] Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.172307 4807 scope.go:117] "RemoveContainer" containerID="4a53654bff9d44a0279c72b00799b40dacdc26ba6046d93ba15b510db32093e4" Nov 27 11:26:21 crc kubenswrapper[4807]: E1127 11:26:21.177506 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a53654bff9d44a0279c72b00799b40dacdc26ba6046d93ba15b510db32093e4\": container with ID starting with 4a53654bff9d44a0279c72b00799b40dacdc26ba6046d93ba15b510db32093e4 not found: ID does not exist" containerID="4a53654bff9d44a0279c72b00799b40dacdc26ba6046d93ba15b510db32093e4" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.177545 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a53654bff9d44a0279c72b00799b40dacdc26ba6046d93ba15b510db32093e4"} err="failed to get container status \"4a53654bff9d44a0279c72b00799b40dacdc26ba6046d93ba15b510db32093e4\": rpc error: code = NotFound desc = could not find container \"4a53654bff9d44a0279c72b00799b40dacdc26ba6046d93ba15b510db32093e4\": container with ID starting with 4a53654bff9d44a0279c72b00799b40dacdc26ba6046d93ba15b510db32093e4 not found: ID does not exist" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.177569 4807 scope.go:117] "RemoveContainer" containerID="ebb3b7f06d304d2ccd62bfc4a5cf07f9b5d2a2944cba3ba7a342adb0e12d6b5c" Nov 27 11:26:21 crc kubenswrapper[4807]: E1127 11:26:21.179391 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebb3b7f06d304d2ccd62bfc4a5cf07f9b5d2a2944cba3ba7a342adb0e12d6b5c\": container with ID starting with ebb3b7f06d304d2ccd62bfc4a5cf07f9b5d2a2944cba3ba7a342adb0e12d6b5c not found: ID 
does not exist" containerID="ebb3b7f06d304d2ccd62bfc4a5cf07f9b5d2a2944cba3ba7a342adb0e12d6b5c" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.179415 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb3b7f06d304d2ccd62bfc4a5cf07f9b5d2a2944cba3ba7a342adb0e12d6b5c"} err="failed to get container status \"ebb3b7f06d304d2ccd62bfc4a5cf07f9b5d2a2944cba3ba7a342adb0e12d6b5c\": rpc error: code = NotFound desc = could not find container \"ebb3b7f06d304d2ccd62bfc4a5cf07f9b5d2a2944cba3ba7a342adb0e12d6b5c\": container with ID starting with ebb3b7f06d304d2ccd62bfc4a5cf07f9b5d2a2944cba3ba7a342adb0e12d6b5c not found: ID does not exist" Nov 27 11:26:21 crc kubenswrapper[4807]: W1127 11:26:21.186992 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a3d55df_3e92_4cb5_aedd_7589b72d5471.slice/crio-b241dac74f182dd7b3b450c1efda750c4107f8a5618fbce34b6c1226c009fe94 WatchSource:0}: Error finding container b241dac74f182dd7b3b450c1efda750c4107f8a5618fbce34b6c1226c009fe94: Status 404 returned error can't find the container with id b241dac74f182dd7b3b450c1efda750c4107f8a5618fbce34b6c1226c009fe94 Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.199703 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-config" (OuterVolumeSpecName: "config") pod "d8e8994e-3b30-4187-b13a-21258a2e8c25" (UID: "d8e8994e-3b30-4187-b13a-21258a2e8c25"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.206560 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-5w7sj"] Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.231962 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.231992 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e8994e-3b30-4187-b13a-21258a2e8c25-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.247390 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6db58799f7-fkmzb"] Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.296674 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-whdsr"] Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.309566 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.314721 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:26:21 crc kubenswrapper[4807]: E1127 11:26:21.315150 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e8994e-3b30-4187-b13a-21258a2e8c25" containerName="dnsmasq-dns" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.315163 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e8994e-3b30-4187-b13a-21258a2e8c25" containerName="dnsmasq-dns" Nov 27 11:26:21 crc kubenswrapper[4807]: E1127 11:26:21.315174 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e8994e-3b30-4187-b13a-21258a2e8c25" containerName="init" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.315180 4807 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e8994e-3b30-4187-b13a-21258a2e8c25" containerName="init" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.315346 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e8994e-3b30-4187-b13a-21258a2e8c25" containerName="dnsmasq-dns" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.316274 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.321525 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.342287 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-dns-swift-storage-0\") pod \"42abfcef-7f24-4754-ac86-b4014209e1ee\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.342347 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-ovsdbserver-sb\") pod \"42abfcef-7f24-4754-ac86-b4014209e1ee\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.342382 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf5hb\" (UniqueName: \"kubernetes.io/projected/42abfcef-7f24-4754-ac86-b4014209e1ee-kube-api-access-jf5hb\") pod \"42abfcef-7f24-4754-ac86-b4014209e1ee\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.342489 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-config\") pod \"42abfcef-7f24-4754-ac86-b4014209e1ee\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.342597 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-ovsdbserver-nb\") pod \"42abfcef-7f24-4754-ac86-b4014209e1ee\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.342680 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-dns-svc\") pod \"42abfcef-7f24-4754-ac86-b4014209e1ee\" (UID: \"42abfcef-7f24-4754-ac86-b4014209e1ee\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.343045 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "42abfcef-7f24-4754-ac86-b4014209e1ee" (UID: "42abfcef-7f24-4754-ac86-b4014209e1ee"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.343397 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.343692 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "42abfcef-7f24-4754-ac86-b4014209e1ee" (UID: "42abfcef-7f24-4754-ac86-b4014209e1ee"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.343747 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-config" (OuterVolumeSpecName: "config") pod "42abfcef-7f24-4754-ac86-b4014209e1ee" (UID: "42abfcef-7f24-4754-ac86-b4014209e1ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.343959 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "42abfcef-7f24-4754-ac86-b4014209e1ee" (UID: "42abfcef-7f24-4754-ac86-b4014209e1ee"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.344629 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "42abfcef-7f24-4754-ac86-b4014209e1ee" (UID: "42abfcef-7f24-4754-ac86-b4014209e1ee"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.345193 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.356444 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42abfcef-7f24-4754-ac86-b4014209e1ee-kube-api-access-jf5hb" (OuterVolumeSpecName: "kube-api-access-jf5hb") pod "42abfcef-7f24-4754-ac86-b4014209e1ee" (UID: "42abfcef-7f24-4754-ac86-b4014209e1ee"). InnerVolumeSpecName "kube-api-access-jf5hb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.387120 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-sgwqm"] Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.408053 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.416196 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-xzmxr"] Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.440732 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-xzmxr"] Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.444914 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhfdn\" (UniqueName: \"kubernetes.io/projected/c8644cb7-09bb-4783-aef2-e3dd150616d2-kube-api-access-hhfdn\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.444975 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.445003 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.445032 
4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.445074 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.445099 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8644cb7-09bb-4783-aef2-e3dd150616d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.445114 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8644cb7-09bb-4783-aef2-e3dd150616d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.445160 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.445170 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.445180 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf5hb\" (UniqueName: \"kubernetes.io/projected/42abfcef-7f24-4754-ac86-b4014209e1ee-kube-api-access-jf5hb\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.445188 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.445196 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42abfcef-7f24-4754-ac86-b4014209e1ee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.446364 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-prxjj"] Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.547913 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8644cb7-09bb-4783-aef2-e3dd150616d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.547947 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8644cb7-09bb-4783-aef2-e3dd150616d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.548005 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hhfdn\" (UniqueName: \"kubernetes.io/projected/c8644cb7-09bb-4783-aef2-e3dd150616d2-kube-api-access-hhfdn\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.548041 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.548066 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.548103 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.548145 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.549920 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c8644cb7-09bb-4783-aef2-e3dd150616d2-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.550208 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8644cb7-09bb-4783-aef2-e3dd150616d2-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.550509 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.568741 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.573110 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.576939 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e8994e-3b30-4187-b13a-21258a2e8c25" path="/var/lib/kubelet/pods/d8e8994e-3b30-4187-b13a-21258a2e8c25/volumes" Nov 27 
11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.585628 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.590076 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhfdn\" (UniqueName: \"kubernetes.io/projected/c8644cb7-09bb-4783-aef2-e3dd150616d2-kube-api-access-hhfdn\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.621082 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.648279 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.775005 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.968054 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-ovsdbserver-sb\") pod \"fa487e14-332e-40e2-84b1-aeee891cc296\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.968439 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-dns-swift-storage-0\") pod \"fa487e14-332e-40e2-84b1-aeee891cc296\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.968473 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-ovsdbserver-nb\") pod \"fa487e14-332e-40e2-84b1-aeee891cc296\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.968524 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-dns-svc\") pod \"fa487e14-332e-40e2-84b1-aeee891cc296\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.968547 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-config\") pod \"fa487e14-332e-40e2-84b1-aeee891cc296\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.968583 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp4qf\" 
(UniqueName: \"kubernetes.io/projected/fa487e14-332e-40e2-84b1-aeee891cc296-kube-api-access-rp4qf\") pod \"fa487e14-332e-40e2-84b1-aeee891cc296\" (UID: \"fa487e14-332e-40e2-84b1-aeee891cc296\") " Nov 27 11:26:21 crc kubenswrapper[4807]: I1127 11:26:21.995673 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa487e14-332e-40e2-84b1-aeee891cc296-kube-api-access-rp4qf" (OuterVolumeSpecName: "kube-api-access-rp4qf") pod "fa487e14-332e-40e2-84b1-aeee891cc296" (UID: "fa487e14-332e-40e2-84b1-aeee891cc296"). InnerVolumeSpecName "kube-api-access-rp4qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.041503 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa487e14-332e-40e2-84b1-aeee891cc296" (UID: "fa487e14-332e-40e2-84b1-aeee891cc296"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.050044 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fa487e14-332e-40e2-84b1-aeee891cc296" (UID: "fa487e14-332e-40e2-84b1-aeee891cc296"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.066772 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-config" (OuterVolumeSpecName: "config") pod "fa487e14-332e-40e2-84b1-aeee891cc296" (UID: "fa487e14-332e-40e2-84b1-aeee891cc296"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.075671 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.075699 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.076336 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp4qf\" (UniqueName: \"kubernetes.io/projected/fa487e14-332e-40e2-84b1-aeee891cc296-kube-api-access-rp4qf\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.076391 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.077563 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.084421 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fa487e14-332e-40e2-84b1-aeee891cc296" (UID: "fa487e14-332e-40e2-84b1-aeee891cc296"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.104541 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fa487e14-332e-40e2-84b1-aeee891cc296" (UID: "fa487e14-332e-40e2-84b1-aeee891cc296"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.118882 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a3d55df-3e92-4cb5-aedd-7589b72d5471","Type":"ContainerStarted","Data":"b241dac74f182dd7b3b450c1efda750c4107f8a5618fbce34b6c1226c009fe94"} Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.131970 4807 generic.go:334] "Generic (PLEG): container finished" podID="d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" containerID="27182edd6fc96b73e26502ceee92b8a3adfb372535b97cf74ec4ed32682aa5d7" exitCode=0 Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.132058 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" event={"ID":"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91","Type":"ContainerDied","Data":"27182edd6fc96b73e26502ceee92b8a3adfb372535b97cf74ec4ed32682aa5d7"} Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.132082 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" event={"ID":"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91","Type":"ContainerStarted","Data":"019bd4fd56f0fa2dd0ad20baab4db2f34393f4a5e385a45c8d14326f47c091a6"} Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.148692 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6db58799f7-fkmzb"] Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.150264 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="fa487e14-332e-40e2-84b1-aeee891cc296" containerID="40b723f4fcbf4c86e69e855f2fbc54f87b017a57cc47f4136d4c3aea8b8e2174" exitCode=0 Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.150332 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" event={"ID":"fa487e14-332e-40e2-84b1-aeee891cc296","Type":"ContainerDied","Data":"40b723f4fcbf4c86e69e855f2fbc54f87b017a57cc47f4136d4c3aea8b8e2174"} Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.150354 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" event={"ID":"fa487e14-332e-40e2-84b1-aeee891cc296","Type":"ContainerDied","Data":"8f444d8ad393431c6b90b0c6fb3866dde70e730691f0729e5403cada0ff85c5c"} Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.150368 4807 scope.go:117] "RemoveContainer" containerID="40b723f4fcbf4c86e69e855f2fbc54f87b017a57cc47f4136d4c3aea8b8e2174" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.150510 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-g5j4r" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.168548 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-84ddb7cbfc-7lt2m"] Nov 27 11:26:22 crc kubenswrapper[4807]: E1127 11:26:22.169176 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa487e14-332e-40e2-84b1-aeee891cc296" containerName="init" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.169190 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa487e14-332e-40e2-84b1-aeee891cc296" containerName="init" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.169758 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa487e14-332e-40e2-84b1-aeee891cc296" containerName="init" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.171520 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.177423 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-logs\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.177458 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-config-data\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.177487 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-horizon-secret-key\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.177632 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-scripts\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.177655 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjs2l\" (UniqueName: \"kubernetes.io/projected/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-kube-api-access-mjs2l\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: 
\"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.177696 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.177707 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa487e14-332e-40e2-84b1-aeee891cc296-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.191223 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84ddb7cbfc-7lt2m"] Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.198364 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"43ee107e-ce97-4aac-9d93-e56844eb58f7","Type":"ContainerStarted","Data":"25101fed9e54531041d06fbf6f112d7ac007815779446475e065b41e8548a837"} Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.220713 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.228028 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5w7sj" event={"ID":"fda15506-8b54-4097-ba8e-55dabab3ede7","Type":"ContainerStarted","Data":"b2312dad39bfa6284f68ec8d4106417fddde6821195f09606a51db2d99c9906b"} Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.228080 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5w7sj" event={"ID":"fda15506-8b54-4097-ba8e-55dabab3ede7","Type":"ContainerStarted","Data":"9d457037c3d6f125f03b3616a9a2cc4906a98b06c88ad8b89d5064e10b6a9a30"} Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.279122 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-config-data\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.279174 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-horizon-secret-key\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.279336 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-scripts\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.279366 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjs2l\" (UniqueName: \"kubernetes.io/projected/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-kube-api-access-mjs2l\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.279398 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-logs\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.279736 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-logs\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.280712 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-config-data\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.288846 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-horizon-secret-key\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.290333 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-scripts\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.303167 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-whdsr" event={"ID":"1b745997-2256-496c-acee-f804c263ec35","Type":"ContainerStarted","Data":"69892bdf7663ff68d6bcc69f52c55d993d33f69e36b9750ab29d86f61f3d7efc"} Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.326849 4807 scope.go:117] "RemoveContainer" containerID="40b723f4fcbf4c86e69e855f2fbc54f87b017a57cc47f4136d4c3aea8b8e2174" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.327657 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjs2l\" (UniqueName: 
\"kubernetes.io/projected/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-kube-api-access-mjs2l\") pod \"horizon-84ddb7cbfc-7lt2m\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: E1127 11:26:22.335280 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40b723f4fcbf4c86e69e855f2fbc54f87b017a57cc47f4136d4c3aea8b8e2174\": container with ID starting with 40b723f4fcbf4c86e69e855f2fbc54f87b017a57cc47f4136d4c3aea8b8e2174 not found: ID does not exist" containerID="40b723f4fcbf4c86e69e855f2fbc54f87b017a57cc47f4136d4c3aea8b8e2174" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.335321 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40b723f4fcbf4c86e69e855f2fbc54f87b017a57cc47f4136d4c3aea8b8e2174"} err="failed to get container status \"40b723f4fcbf4c86e69e855f2fbc54f87b017a57cc47f4136d4c3aea8b8e2174\": rpc error: code = NotFound desc = could not find container \"40b723f4fcbf4c86e69e855f2fbc54f87b017a57cc47f4136d4c3aea8b8e2174\": container with ID starting with 40b723f4fcbf4c86e69e855f2fbc54f87b017a57cc47f4136d4c3aea8b8e2174 not found: ID does not exist" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.364560 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-sgwqm" event={"ID":"0f65bbc0-75a0-4294-9cf8-0023799a1fea","Type":"ContainerStarted","Data":"8d67b1bae9cfd5613d009bb602854d4a988fcbeeb484ba3d6f4e4906a0343820"} Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.395300 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.434385 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rbw47" 
event={"ID":"b2df3b54-f71f-469f-92e5-8c1daeb90a45","Type":"ContainerStarted","Data":"2576ebbcc36a31162bfcfbb375407818c273598a76d2ef1dec0d1d8da64ae3d7"} Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.448305 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-g5j4r"] Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.452519 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-g5j4r"] Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.476839 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-5w7sj" podStartSLOduration=3.476821379 podStartE2EDuration="3.476821379s" podCreationTimestamp="2025-11-27 11:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:22.321139512 +0000 UTC m=+1023.420637710" watchObservedRunningTime="2025-11-27 11:26:22.476821379 +0000 UTC m=+1023.576319577" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.514597 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.517433 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6db58799f7-fkmzb" event={"ID":"da9ed693-7637-404d-8dd9-e849e11b4d43","Type":"ContainerStarted","Data":"9144722eccde7552e9ba12b12416b8496753174802bf8f2a7934f0b6f4de002d"} Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.517564 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-rxznz" Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.519848 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.674598 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-rxznz"] Nov 27 11:26:22 crc kubenswrapper[4807]: I1127 11:26:22.681563 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-rxznz"] Nov 27 11:26:23 crc kubenswrapper[4807]: I1127 11:26:23.353782 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84ddb7cbfc-7lt2m"] Nov 27 11:26:23 crc kubenswrapper[4807]: I1127 11:26:23.556705 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42abfcef-7f24-4754-ac86-b4014209e1ee" path="/var/lib/kubelet/pods/42abfcef-7f24-4754-ac86-b4014209e1ee/volumes" Nov 27 11:26:23 crc kubenswrapper[4807]: I1127 11:26:23.557125 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa487e14-332e-40e2-84b1-aeee891cc296" path="/var/lib/kubelet/pods/fa487e14-332e-40e2-84b1-aeee891cc296/volumes" Nov 27 11:26:23 crc kubenswrapper[4807]: I1127 11:26:23.557628 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8644cb7-09bb-4783-aef2-e3dd150616d2","Type":"ContainerStarted","Data":"17199d0a1bb2c955c1bc96ce93bec9c67d05622a76fb09856c4ebedbc06e2c6b"} Nov 27 11:26:23 crc kubenswrapper[4807]: I1127 11:26:23.557657 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84ddb7cbfc-7lt2m" event={"ID":"a0ccd0fd-c32d-40c1-b281-35b2b322faf7","Type":"ContainerStarted","Data":"732ddeea9dd1edc994c275c3541acd981660efcabeae12cc0e59a92242909ebf"} Nov 27 11:26:23 crc kubenswrapper[4807]: I1127 11:26:23.558402 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" event={"ID":"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91","Type":"ContainerStarted","Data":"7c17c739e4f56e868f315b50f5a1413747243af516557bfdf6fbadec1a185e9a"} Nov 27 11:26:23 crc kubenswrapper[4807]: I1127 11:26:23.559139 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:23 crc kubenswrapper[4807]: I1127 11:26:23.565842 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"43ee107e-ce97-4aac-9d93-e56844eb58f7","Type":"ContainerStarted","Data":"0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b"} Nov 27 11:26:23 crc kubenswrapper[4807]: I1127 11:26:23.583445 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" podStartSLOduration=4.583427456 podStartE2EDuration="4.583427456s" podCreationTimestamp="2025-11-27 11:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:23.580497349 +0000 UTC m=+1024.679995547" watchObservedRunningTime="2025-11-27 11:26:23.583427456 +0000 UTC m=+1024.682925654" Nov 27 11:26:24 crc kubenswrapper[4807]: I1127 11:26:24.585868 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"43ee107e-ce97-4aac-9d93-e56844eb58f7","Type":"ContainerStarted","Data":"83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de"} Nov 27 11:26:24 crc kubenswrapper[4807]: I1127 11:26:24.586013 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="43ee107e-ce97-4aac-9d93-e56844eb58f7" containerName="glance-log" containerID="cri-o://0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b" gracePeriod=30 Nov 27 11:26:24 crc kubenswrapper[4807]: I1127 11:26:24.586083 
4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="43ee107e-ce97-4aac-9d93-e56844eb58f7" containerName="glance-httpd" containerID="cri-o://83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de" gracePeriod=30 Nov 27 11:26:24 crc kubenswrapper[4807]: I1127 11:26:24.610740 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.610720197 podStartE2EDuration="5.610720197s" podCreationTimestamp="2025-11-27 11:26:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:24.60442444 +0000 UTC m=+1025.703922638" watchObservedRunningTime="2025-11-27 11:26:24.610720197 +0000 UTC m=+1025.710218395" Nov 27 11:26:24 crc kubenswrapper[4807]: I1127 11:26:24.613477 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8644cb7-09bb-4783-aef2-e3dd150616d2","Type":"ContainerStarted","Data":"7cb9709b919bee9cd99e40a0058753c87b76d6dcd597e653b6a4c1eac309bf86"} Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.295820 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.356685 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-config-data\") pod \"43ee107e-ce97-4aac-9d93-e56844eb58f7\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.356761 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-scripts\") pod \"43ee107e-ce97-4aac-9d93-e56844eb58f7\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.356808 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-combined-ca-bundle\") pod \"43ee107e-ce97-4aac-9d93-e56844eb58f7\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.356832 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43ee107e-ce97-4aac-9d93-e56844eb58f7-httpd-run\") pod \"43ee107e-ce97-4aac-9d93-e56844eb58f7\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.356868 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lh6f\" (UniqueName: \"kubernetes.io/projected/43ee107e-ce97-4aac-9d93-e56844eb58f7-kube-api-access-5lh6f\") pod \"43ee107e-ce97-4aac-9d93-e56844eb58f7\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.358614 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/43ee107e-ce97-4aac-9d93-e56844eb58f7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "43ee107e-ce97-4aac-9d93-e56844eb58f7" (UID: "43ee107e-ce97-4aac-9d93-e56844eb58f7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.381004 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-scripts" (OuterVolumeSpecName: "scripts") pod "43ee107e-ce97-4aac-9d93-e56844eb58f7" (UID: "43ee107e-ce97-4aac-9d93-e56844eb58f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.381214 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ee107e-ce97-4aac-9d93-e56844eb58f7-kube-api-access-5lh6f" (OuterVolumeSpecName: "kube-api-access-5lh6f") pod "43ee107e-ce97-4aac-9d93-e56844eb58f7" (UID: "43ee107e-ce97-4aac-9d93-e56844eb58f7"). InnerVolumeSpecName "kube-api-access-5lh6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.396196 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43ee107e-ce97-4aac-9d93-e56844eb58f7" (UID: "43ee107e-ce97-4aac-9d93-e56844eb58f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.429681 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-config-data" (OuterVolumeSpecName: "config-data") pod "43ee107e-ce97-4aac-9d93-e56844eb58f7" (UID: "43ee107e-ce97-4aac-9d93-e56844eb58f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.463483 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43ee107e-ce97-4aac-9d93-e56844eb58f7-logs\") pod \"43ee107e-ce97-4aac-9d93-e56844eb58f7\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.463553 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"43ee107e-ce97-4aac-9d93-e56844eb58f7\" (UID: \"43ee107e-ce97-4aac-9d93-e56844eb58f7\") " Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.463803 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.463813 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.463822 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43ee107e-ce97-4aac-9d93-e56844eb58f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.463832 4807 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43ee107e-ce97-4aac-9d93-e56844eb58f7-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.463841 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lh6f\" (UniqueName: \"kubernetes.io/projected/43ee107e-ce97-4aac-9d93-e56844eb58f7-kube-api-access-5lh6f\") on 
node \"crc\" DevicePath \"\"" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.474670 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ee107e-ce97-4aac-9d93-e56844eb58f7-logs" (OuterVolumeSpecName: "logs") pod "43ee107e-ce97-4aac-9d93-e56844eb58f7" (UID: "43ee107e-ce97-4aac-9d93-e56844eb58f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.475424 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "43ee107e-ce97-4aac-9d93-e56844eb58f7" (UID: "43ee107e-ce97-4aac-9d93-e56844eb58f7"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.566382 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43ee107e-ce97-4aac-9d93-e56844eb58f7-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.566432 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.631503 4807 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.663449 4807 generic.go:334] "Generic (PLEG): container finished" podID="43ee107e-ce97-4aac-9d93-e56844eb58f7" containerID="83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de" exitCode=0 Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.663496 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="43ee107e-ce97-4aac-9d93-e56844eb58f7" containerID="0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b" exitCode=143 Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.663533 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.663550 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"43ee107e-ce97-4aac-9d93-e56844eb58f7","Type":"ContainerDied","Data":"83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de"} Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.663586 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"43ee107e-ce97-4aac-9d93-e56844eb58f7","Type":"ContainerDied","Data":"0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b"} Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.663597 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"43ee107e-ce97-4aac-9d93-e56844eb58f7","Type":"ContainerDied","Data":"25101fed9e54531041d06fbf6f112d7ac007815779446475e065b41e8548a837"} Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.663611 4807 scope.go:117] "RemoveContainer" containerID="83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.669953 4807 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.675351 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"c8644cb7-09bb-4783-aef2-e3dd150616d2","Type":"ContainerStarted","Data":"0ecfeefe74c9d576bd862dea32bece7737d6aba8a800b8a2109be564641c56e9"} Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.675486 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c8644cb7-09bb-4783-aef2-e3dd150616d2" containerName="glance-log" containerID="cri-o://7cb9709b919bee9cd99e40a0058753c87b76d6dcd597e653b6a4c1eac309bf86" gracePeriod=30 Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.675682 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c8644cb7-09bb-4783-aef2-e3dd150616d2" containerName="glance-httpd" containerID="cri-o://0ecfeefe74c9d576bd862dea32bece7737d6aba8a800b8a2109be564641c56e9" gracePeriod=30 Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.682025 4807 generic.go:334] "Generic (PLEG): container finished" podID="48274d60-034d-4718-8126-594ffefd281e" containerID="e395946d5b7983decac1d895d83ca3d8f3b7b0b1ee7634702f1bac34d0480104" exitCode=0 Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.682319 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fg767" event={"ID":"48274d60-034d-4718-8126-594ffefd281e","Type":"ContainerDied","Data":"e395946d5b7983decac1d895d83ca3d8f3b7b0b1ee7634702f1bac34d0480104"} Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.701385 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.710501 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.718033 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.718017572 
podStartE2EDuration="5.718017572s" podCreationTimestamp="2025-11-27 11:26:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:25.712875866 +0000 UTC m=+1026.812374064" watchObservedRunningTime="2025-11-27 11:26:25.718017572 +0000 UTC m=+1026.817515770" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.768093 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:25 crc kubenswrapper[4807]: E1127 11:26:25.768560 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ee107e-ce97-4aac-9d93-e56844eb58f7" containerName="glance-httpd" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.768576 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ee107e-ce97-4aac-9d93-e56844eb58f7" containerName="glance-httpd" Nov 27 11:26:25 crc kubenswrapper[4807]: E1127 11:26:25.768606 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43ee107e-ce97-4aac-9d93-e56844eb58f7" containerName="glance-log" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.768614 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ee107e-ce97-4aac-9d93-e56844eb58f7" containerName="glance-log" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.768811 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ee107e-ce97-4aac-9d93-e56844eb58f7" containerName="glance-log" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.768825 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="43ee107e-ce97-4aac-9d93-e56844eb58f7" containerName="glance-httpd" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.771956 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.777092 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.802020 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.972872 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef7696d-30d7-4785-97ae-0910fa44871c-logs\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.973275 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ef7696d-30d7-4785-97ae-0910fa44871c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.973312 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-config-data\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.973382 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5qcj\" (UniqueName: \"kubernetes.io/projected/8ef7696d-30d7-4785-97ae-0910fa44871c-kube-api-access-c5qcj\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " 
pod="openstack/glance-default-external-api-0" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.973435 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.973458 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:25 crc kubenswrapper[4807]: I1127 11:26:25.973533 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-scripts\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.075045 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef7696d-30d7-4785-97ae-0910fa44871c-logs\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.075087 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ef7696d-30d7-4785-97ae-0910fa44871c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 
11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.075119 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-config-data\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.075149 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5qcj\" (UniqueName: \"kubernetes.io/projected/8ef7696d-30d7-4785-97ae-0910fa44871c-kube-api-access-c5qcj\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.075180 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.075195 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.075282 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-scripts\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.075521 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef7696d-30d7-4785-97ae-0910fa44871c-logs\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.075576 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ef7696d-30d7-4785-97ae-0910fa44871c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.075692 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.080047 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-scripts\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.090193 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.092342 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-config-data\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.100193 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5qcj\" (UniqueName: \"kubernetes.io/projected/8ef7696d-30d7-4785-97ae-0910fa44871c-kube-api-access-c5qcj\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.109377 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.398314 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.696077 4807 generic.go:334] "Generic (PLEG): container finished" podID="c8644cb7-09bb-4783-aef2-e3dd150616d2" containerID="0ecfeefe74c9d576bd862dea32bece7737d6aba8a800b8a2109be564641c56e9" exitCode=0 Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.696105 4807 generic.go:334] "Generic (PLEG): container finished" podID="c8644cb7-09bb-4783-aef2-e3dd150616d2" containerID="7cb9709b919bee9cd99e40a0058753c87b76d6dcd597e653b6a4c1eac309bf86" exitCode=143 Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.696265 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8644cb7-09bb-4783-aef2-e3dd150616d2","Type":"ContainerDied","Data":"0ecfeefe74c9d576bd862dea32bece7737d6aba8a800b8a2109be564641c56e9"} Nov 27 11:26:26 crc kubenswrapper[4807]: I1127 11:26:26.696290 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8644cb7-09bb-4783-aef2-e3dd150616d2","Type":"ContainerDied","Data":"7cb9709b919bee9cd99e40a0058753c87b76d6dcd597e653b6a4c1eac309bf86"} Nov 27 11:26:27 crc kubenswrapper[4807]: I1127 11:26:27.543050 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ee107e-ce97-4aac-9d93-e56844eb58f7" path="/var/lib/kubelet/pods/43ee107e-ce97-4aac-9d93-e56844eb58f7/volumes" Nov 27 11:26:30 crc kubenswrapper[4807]: I1127 11:26:30.605430 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:26:30 crc kubenswrapper[4807]: I1127 11:26:30.663527 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rtrnf"] Nov 27 11:26:30 crc kubenswrapper[4807]: I1127 11:26:30.663806 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-rtrnf" 
podUID="fcecd1ef-65f5-47b8-8d41-55b3da46db65" containerName="dnsmasq-dns" containerID="cri-o://7677a534b698f567bf3cf669b38eca8fd4eee4df46f5f7240e5db151c875f6af" gracePeriod=10 Nov 27 11:26:31 crc kubenswrapper[4807]: I1127 11:26:31.773386 4807 generic.go:334] "Generic (PLEG): container finished" podID="fcecd1ef-65f5-47b8-8d41-55b3da46db65" containerID="7677a534b698f567bf3cf669b38eca8fd4eee4df46f5f7240e5db151c875f6af" exitCode=0 Nov 27 11:26:31 crc kubenswrapper[4807]: I1127 11:26:31.773437 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rtrnf" event={"ID":"fcecd1ef-65f5-47b8-8d41-55b3da46db65","Type":"ContainerDied","Data":"7677a534b698f567bf3cf669b38eca8fd4eee4df46f5f7240e5db151c875f6af"} Nov 27 11:26:31 crc kubenswrapper[4807]: I1127 11:26:31.818837 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:32 crc kubenswrapper[4807]: I1127 11:26:32.773493 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f6577b49-lkc4r"] Nov 27 11:26:32 crc kubenswrapper[4807]: I1127 11:26:32.816565 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7b8bd6c76d-jg9hp"] Nov 27 11:26:32 crc kubenswrapper[4807]: I1127 11:26:32.818169 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:32 crc kubenswrapper[4807]: I1127 11:26:32.821419 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Nov 27 11:26:32 crc kubenswrapper[4807]: I1127 11:26:32.833738 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b8bd6c76d-jg9hp"] Nov 27 11:26:32 crc kubenswrapper[4807]: I1127 11:26:32.879580 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84ddb7cbfc-7lt2m"] Nov 27 11:26:32 crc kubenswrapper[4807]: I1127 11:26:32.908388 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7d69cff6fb-88t5t"] Nov 27 11:26:32 crc kubenswrapper[4807]: I1127 11:26:32.910043 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:32 crc kubenswrapper[4807]: I1127 11:26:32.926831 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d69cff6fb-88t5t"] Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.007502 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-horizon-secret-key\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.007555 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-horizon-tls-certs\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.007588 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ssbt6\" (UniqueName: \"kubernetes.io/projected/972db85e-5d7f-4312-b2c1-36f3c4e697d3-kube-api-access-ssbt6\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.008152 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-combined-ca-bundle\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.008180 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/972db85e-5d7f-4312-b2c1-36f3c4e697d3-scripts\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.008214 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/972db85e-5d7f-4312-b2c1-36f3c4e697d3-config-data\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.008268 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/972db85e-5d7f-4312-b2c1-36f3c4e697d3-logs\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.109627 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-combined-ca-bundle\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.109671 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba9d500c-ec74-4755-924d-8b6160bb51dc-config-data\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.109696 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/972db85e-5d7f-4312-b2c1-36f3c4e697d3-scripts\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.109739 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/972db85e-5d7f-4312-b2c1-36f3c4e697d3-config-data\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.109763 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2wc5\" (UniqueName: \"kubernetes.io/projected/ba9d500c-ec74-4755-924d-8b6160bb51dc-kube-api-access-h2wc5\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.110378 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/972db85e-5d7f-4312-b2c1-36f3c4e697d3-logs\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.110432 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9d500c-ec74-4755-924d-8b6160bb51dc-horizon-tls-certs\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.110478 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9d500c-ec74-4755-924d-8b6160bb51dc-combined-ca-bundle\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.110503 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9d500c-ec74-4755-924d-8b6160bb51dc-logs\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.110522 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-horizon-secret-key\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.110790 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-horizon-tls-certs\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.110874 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssbt6\" (UniqueName: \"kubernetes.io/projected/972db85e-5d7f-4312-b2c1-36f3c4e697d3-kube-api-access-ssbt6\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.110886 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/972db85e-5d7f-4312-b2c1-36f3c4e697d3-logs\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.110899 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba9d500c-ec74-4755-924d-8b6160bb51dc-horizon-secret-key\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.110983 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba9d500c-ec74-4755-924d-8b6160bb51dc-scripts\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.111769 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/972db85e-5d7f-4312-b2c1-36f3c4e697d3-scripts\") pod 
\"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.111837 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/972db85e-5d7f-4312-b2c1-36f3c4e697d3-config-data\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.125951 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-horizon-secret-key\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.126150 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-horizon-tls-certs\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.126344 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-combined-ca-bundle\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.132403 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssbt6\" (UniqueName: \"kubernetes.io/projected/972db85e-5d7f-4312-b2c1-36f3c4e697d3-kube-api-access-ssbt6\") pod \"horizon-7b8bd6c76d-jg9hp\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " 
pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.144350 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.212940 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9d500c-ec74-4755-924d-8b6160bb51dc-combined-ca-bundle\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.213431 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9d500c-ec74-4755-924d-8b6160bb51dc-logs\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.213537 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba9d500c-ec74-4755-924d-8b6160bb51dc-horizon-secret-key\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.213580 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba9d500c-ec74-4755-924d-8b6160bb51dc-scripts\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.213612 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba9d500c-ec74-4755-924d-8b6160bb51dc-config-data\") pod 
\"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.213701 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2wc5\" (UniqueName: \"kubernetes.io/projected/ba9d500c-ec74-4755-924d-8b6160bb51dc-kube-api-access-h2wc5\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.213786 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9d500c-ec74-4755-924d-8b6160bb51dc-horizon-tls-certs\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.213914 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba9d500c-ec74-4755-924d-8b6160bb51dc-logs\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.214344 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba9d500c-ec74-4755-924d-8b6160bb51dc-scripts\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.216745 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ba9d500c-ec74-4755-924d-8b6160bb51dc-config-data\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc 
kubenswrapper[4807]: I1127 11:26:33.217751 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ba9d500c-ec74-4755-924d-8b6160bb51dc-horizon-secret-key\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.217793 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba9d500c-ec74-4755-924d-8b6160bb51dc-combined-ca-bundle\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.218347 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba9d500c-ec74-4755-924d-8b6160bb51dc-horizon-tls-certs\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.231664 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2wc5\" (UniqueName: \"kubernetes.io/projected/ba9d500c-ec74-4755-924d-8b6160bb51dc-kube-api-access-h2wc5\") pod \"horizon-7d69cff6fb-88t5t\" (UID: \"ba9d500c-ec74-4755-924d-8b6160bb51dc\") " pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.525725 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:26:33 crc kubenswrapper[4807]: I1127 11:26:33.746486 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-rtrnf" podUID="fcecd1ef-65f5-47b8-8d41-55b3da46db65" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.383834 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.437076 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwbq8\" (UniqueName: \"kubernetes.io/projected/48274d60-034d-4718-8126-594ffefd281e-kube-api-access-bwbq8\") pod \"48274d60-034d-4718-8126-594ffefd281e\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.437161 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-fernet-keys\") pod \"48274d60-034d-4718-8126-594ffefd281e\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.437192 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-config-data\") pod \"48274d60-034d-4718-8126-594ffefd281e\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.437316 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-credential-keys\") pod \"48274d60-034d-4718-8126-594ffefd281e\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " Nov 27 
11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.437380 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-combined-ca-bundle\") pod \"48274d60-034d-4718-8126-594ffefd281e\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.437454 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-scripts\") pod \"48274d60-034d-4718-8126-594ffefd281e\" (UID: \"48274d60-034d-4718-8126-594ffefd281e\") " Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.447320 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48274d60-034d-4718-8126-594ffefd281e-kube-api-access-bwbq8" (OuterVolumeSpecName: "kube-api-access-bwbq8") pod "48274d60-034d-4718-8126-594ffefd281e" (UID: "48274d60-034d-4718-8126-594ffefd281e"). InnerVolumeSpecName "kube-api-access-bwbq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.460014 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-scripts" (OuterVolumeSpecName: "scripts") pod "48274d60-034d-4718-8126-594ffefd281e" (UID: "48274d60-034d-4718-8126-594ffefd281e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.460059 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "48274d60-034d-4718-8126-594ffefd281e" (UID: "48274d60-034d-4718-8126-594ffefd281e"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.468187 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "48274d60-034d-4718-8126-594ffefd281e" (UID: "48274d60-034d-4718-8126-594ffefd281e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.481268 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48274d60-034d-4718-8126-594ffefd281e" (UID: "48274d60-034d-4718-8126-594ffefd281e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.497869 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-config-data" (OuterVolumeSpecName: "config-data") pod "48274d60-034d-4718-8126-594ffefd281e" (UID: "48274d60-034d-4718-8126-594ffefd281e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.539742 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.539773 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwbq8\" (UniqueName: \"kubernetes.io/projected/48274d60-034d-4718-8126-594ffefd281e-kube-api-access-bwbq8\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.539786 4807 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.539795 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.539804 4807 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.539811 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48274d60-034d-4718-8126-594ffefd281e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.798691 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fg767" event={"ID":"48274d60-034d-4718-8126-594ffefd281e","Type":"ContainerDied","Data":"b6ca89041c5518a3f8dfbe7d76baad5836600bf558e33bed4d5a2176e77bf638"} Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 
11:26:34.798731 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6ca89041c5518a3f8dfbe7d76baad5836600bf558e33bed4d5a2176e77bf638" Nov 27 11:26:34 crc kubenswrapper[4807]: I1127 11:26:34.798782 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fg767" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.462747 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fg767"] Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.468567 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fg767"] Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.549954 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48274d60-034d-4718-8126-594ffefd281e" path="/var/lib/kubelet/pods/48274d60-034d-4718-8126-594ffefd281e/volumes" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.564639 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lqqgw"] Nov 27 11:26:35 crc kubenswrapper[4807]: E1127 11:26:35.565017 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48274d60-034d-4718-8126-594ffefd281e" containerName="keystone-bootstrap" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.565030 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="48274d60-034d-4718-8126-594ffefd281e" containerName="keystone-bootstrap" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.565192 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="48274d60-034d-4718-8126-594ffefd281e" containerName="keystone-bootstrap" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.565876 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.569808 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.569829 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.570875 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.571661 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.572746 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-msqwq" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.579551 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lqqgw"] Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.665921 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-scripts\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.666007 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-combined-ca-bundle\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.666105 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rs9f6\" (UniqueName: \"kubernetes.io/projected/3e03b856-d6ea-40a5-96db-10d788131661-kube-api-access-rs9f6\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.666196 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-fernet-keys\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.666273 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-config-data\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.666302 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-credential-keys\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.771747 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-config-data\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.771910 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-credential-keys\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.773052 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-scripts\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.773119 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-combined-ca-bundle\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.773230 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs9f6\" (UniqueName: \"kubernetes.io/projected/3e03b856-d6ea-40a5-96db-10d788131661-kube-api-access-rs9f6\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.773378 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-fernet-keys\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.780279 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-scripts\") pod \"keystone-bootstrap-lqqgw\" (UID: 
\"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.780305 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-credential-keys\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.780395 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-combined-ca-bundle\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.785361 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-fernet-keys\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.788557 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-config-data\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 11:26:35.788977 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs9f6\" (UniqueName: \"kubernetes.io/projected/3e03b856-d6ea-40a5-96db-10d788131661-kube-api-access-rs9f6\") pod \"keystone-bootstrap-lqqgw\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:35 crc kubenswrapper[4807]: I1127 
11:26:35.891600 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:26:37 crc kubenswrapper[4807]: E1127 11:26:37.598283 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Nov 27 11:26:37 crc kubenswrapper[4807]: E1127 11:26:37.598461 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v6j5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},
},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-sgwqm_openstack(0f65bbc0-75a0-4294-9cf8-0023799a1fea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 11:26:37 crc kubenswrapper[4807]: E1127 11:26:37.600130 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-sgwqm" podUID="0f65bbc0-75a0-4294-9cf8-0023799a1fea" Nov 27 11:26:37 crc kubenswrapper[4807]: E1127 11:26:37.834860 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-sgwqm" podUID="0f65bbc0-75a0-4294-9cf8-0023799a1fea" Nov 27 11:26:39 crc kubenswrapper[4807]: E1127 11:26:39.893362 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Nov 27 11:26:39 crc kubenswrapper[4807]: E1127 11:26:39.893948 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n695hfbh5bfh5d4h5b6hbbh88hb7h677hdfh84h67bh699h6bh66bhd8h684hd5h668h54ch8dhd5h56bh9h645h56fhf5h576hd7h654h549h99q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6vjlh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6db58799f7-fkmzb_openstack(da9ed693-7637-404d-8dd9-e849e11b4d43): ErrImagePull: rpc 
error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 11:26:39 crc kubenswrapper[4807]: E1127 11:26:39.896847 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Nov 27 11:26:39 crc kubenswrapper[4807]: E1127 11:26:39.897036 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh699h5fdh5bfh554h58ch59hbch54chc6hc7h5c7h68fh589h5b7h5b4h699h68dh5c4hc4h698h5dbh644hc5h5c9hbch56bh65fh569h64fh95h6fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n99f6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser
:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-f6577b49-lkc4r_openstack(c31ec5c2-fceb-48fd-a060-88185742b123): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 11:26:39 crc kubenswrapper[4807]: E1127 11:26:39.905586 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-f6577b49-lkc4r" podUID="c31ec5c2-fceb-48fd-a060-88185742b123" Nov 27 11:26:39 crc kubenswrapper[4807]: E1127 11:26:39.905699 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6db58799f7-fkmzb" podUID="da9ed693-7637-404d-8dd9-e849e11b4d43" Nov 27 11:26:43 crc kubenswrapper[4807]: I1127 11:26:43.746685 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-rtrnf" podUID="fcecd1ef-65f5-47b8-8d41-55b3da46db65" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Nov 27 11:26:43 crc kubenswrapper[4807]: I1127 11:26:43.895899 
4807 generic.go:334] "Generic (PLEG): container finished" podID="fda15506-8b54-4097-ba8e-55dabab3ede7" containerID="b2312dad39bfa6284f68ec8d4106417fddde6821195f09606a51db2d99c9906b" exitCode=0 Nov 27 11:26:43 crc kubenswrapper[4807]: I1127 11:26:43.895955 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5w7sj" event={"ID":"fda15506-8b54-4097-ba8e-55dabab3ede7","Type":"ContainerDied","Data":"b2312dad39bfa6284f68ec8d4106417fddde6821195f09606a51db2d99c9906b"} Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.310565 4807 scope.go:117] "RemoveContainer" containerID="0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.472301 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.483985 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.604595 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6hcb\" (UniqueName: \"kubernetes.io/projected/fcecd1ef-65f5-47b8-8d41-55b3da46db65-kube-api-access-l6hcb\") pod \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.604661 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-combined-ca-bundle\") pod \"c8644cb7-09bb-4783-aef2-e3dd150616d2\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.604687 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-ovsdbserver-nb\") pod \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.604749 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-config-data\") pod \"c8644cb7-09bb-4783-aef2-e3dd150616d2\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.604800 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-config\") pod \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.604822 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-ovsdbserver-sb\") pod \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.604837 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-scripts\") pod \"c8644cb7-09bb-4783-aef2-e3dd150616d2\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.604880 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-dns-svc\") pod \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\" (UID: \"fcecd1ef-65f5-47b8-8d41-55b3da46db65\") " Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.604919 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"c8644cb7-09bb-4783-aef2-e3dd150616d2\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.604941 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8644cb7-09bb-4783-aef2-e3dd150616d2-logs\") pod \"c8644cb7-09bb-4783-aef2-e3dd150616d2\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.605005 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8644cb7-09bb-4783-aef2-e3dd150616d2-httpd-run\") pod \"c8644cb7-09bb-4783-aef2-e3dd150616d2\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.605036 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhfdn\" (UniqueName: \"kubernetes.io/projected/c8644cb7-09bb-4783-aef2-e3dd150616d2-kube-api-access-hhfdn\") pod \"c8644cb7-09bb-4783-aef2-e3dd150616d2\" (UID: \"c8644cb7-09bb-4783-aef2-e3dd150616d2\") " Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.609539 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8644cb7-09bb-4783-aef2-e3dd150616d2-logs" (OuterVolumeSpecName: "logs") pod "c8644cb7-09bb-4783-aef2-e3dd150616d2" (UID: "c8644cb7-09bb-4783-aef2-e3dd150616d2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.610967 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8644cb7-09bb-4783-aef2-e3dd150616d2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c8644cb7-09bb-4783-aef2-e3dd150616d2" (UID: "c8644cb7-09bb-4783-aef2-e3dd150616d2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.611286 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-scripts" (OuterVolumeSpecName: "scripts") pod "c8644cb7-09bb-4783-aef2-e3dd150616d2" (UID: "c8644cb7-09bb-4783-aef2-e3dd150616d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.617549 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "c8644cb7-09bb-4783-aef2-e3dd150616d2" (UID: "c8644cb7-09bb-4783-aef2-e3dd150616d2"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.618052 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcecd1ef-65f5-47b8-8d41-55b3da46db65-kube-api-access-l6hcb" (OuterVolumeSpecName: "kube-api-access-l6hcb") pod "fcecd1ef-65f5-47b8-8d41-55b3da46db65" (UID: "fcecd1ef-65f5-47b8-8d41-55b3da46db65"). InnerVolumeSpecName "kube-api-access-l6hcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.632013 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8644cb7-09bb-4783-aef2-e3dd150616d2-kube-api-access-hhfdn" (OuterVolumeSpecName: "kube-api-access-hhfdn") pod "c8644cb7-09bb-4783-aef2-e3dd150616d2" (UID: "c8644cb7-09bb-4783-aef2-e3dd150616d2"). InnerVolumeSpecName "kube-api-access-hhfdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.649404 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8644cb7-09bb-4783-aef2-e3dd150616d2" (UID: "c8644cb7-09bb-4783-aef2-e3dd150616d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.671621 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fcecd1ef-65f5-47b8-8d41-55b3da46db65" (UID: "fcecd1ef-65f5-47b8-8d41-55b3da46db65"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.674169 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fcecd1ef-65f5-47b8-8d41-55b3da46db65" (UID: "fcecd1ef-65f5-47b8-8d41-55b3da46db65"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.680149 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-config" (OuterVolumeSpecName: "config") pod "fcecd1ef-65f5-47b8-8d41-55b3da46db65" (UID: "fcecd1ef-65f5-47b8-8d41-55b3da46db65"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.683795 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fcecd1ef-65f5-47b8-8d41-55b3da46db65" (UID: "fcecd1ef-65f5-47b8-8d41-55b3da46db65"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.697912 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-config-data" (OuterVolumeSpecName: "config-data") pod "c8644cb7-09bb-4783-aef2-e3dd150616d2" (UID: "c8644cb7-09bb-4783-aef2-e3dd150616d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.709796 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.710475 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.710547 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.710564 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.710615 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.710628 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8644cb7-09bb-4783-aef2-e3dd150616d2-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.710639 4807 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8644cb7-09bb-4783-aef2-e3dd150616d2-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.710651 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhfdn\" (UniqueName: 
\"kubernetes.io/projected/c8644cb7-09bb-4783-aef2-e3dd150616d2-kube-api-access-hhfdn\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.710690 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6hcb\" (UniqueName: \"kubernetes.io/projected/fcecd1ef-65f5-47b8-8d41-55b3da46db65-kube-api-access-l6hcb\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.710702 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.710712 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fcecd1ef-65f5-47b8-8d41-55b3da46db65-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.710722 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8644cb7-09bb-4783-aef2-e3dd150616d2-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.734014 4807 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.747928 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-rtrnf" podUID="fcecd1ef-65f5-47b8-8d41-55b3da46db65" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: i/o timeout" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.748046 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.812493 4807 
reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.947386 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.947376 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8644cb7-09bb-4783-aef2-e3dd150616d2","Type":"ContainerDied","Data":"17199d0a1bb2c955c1bc96ce93bec9c67d05622a76fb09856c4ebedbc06e2c6b"} Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.950288 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-rtrnf" event={"ID":"fcecd1ef-65f5-47b8-8d41-55b3da46db65","Type":"ContainerDied","Data":"6a9616cca75debbe4574a723eab851d9f616620975788cdebc4b96bbd79c6321"} Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.950369 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-rtrnf" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.983527 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:26:48 crc kubenswrapper[4807]: E1127 11:26:48.988165 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Nov 27 11:26:48 crc kubenswrapper[4807]: E1127 11:26:48.988301 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xr77d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMoun
t:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-rbw47_openstack(b2df3b54-f71f-469f-92e5-8c1daeb90a45): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 11:26:48 crc kubenswrapper[4807]: E1127 11:26:48.989618 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-rbw47" podUID="b2df3b54-f71f-469f-92e5-8c1daeb90a45" Nov 27 11:26:48 crc kubenswrapper[4807]: I1127 11:26:48.998800 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.007777 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.010637 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rtrnf"] Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.020014 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-5w7sj" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.021402 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-rtrnf"] Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.030506 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.032907 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:26:49 crc kubenswrapper[4807]: E1127 11:26:49.033370 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8644cb7-09bb-4783-aef2-e3dd150616d2" containerName="glance-log" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.033394 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8644cb7-09bb-4783-aef2-e3dd150616d2" containerName="glance-log" Nov 27 11:26:49 crc kubenswrapper[4807]: E1127 11:26:49.033411 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8644cb7-09bb-4783-aef2-e3dd150616d2" containerName="glance-httpd" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.033418 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8644cb7-09bb-4783-aef2-e3dd150616d2" containerName="glance-httpd" Nov 27 11:26:49 crc kubenswrapper[4807]: E1127 11:26:49.033446 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcecd1ef-65f5-47b8-8d41-55b3da46db65" containerName="init" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.033456 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcecd1ef-65f5-47b8-8d41-55b3da46db65" containerName="init" Nov 27 11:26:49 crc kubenswrapper[4807]: E1127 11:26:49.033466 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcecd1ef-65f5-47b8-8d41-55b3da46db65" containerName="dnsmasq-dns" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.033473 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcecd1ef-65f5-47b8-8d41-55b3da46db65" containerName="dnsmasq-dns" Nov 27 11:26:49 crc kubenswrapper[4807]: E1127 11:26:49.033488 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fda15506-8b54-4097-ba8e-55dabab3ede7" containerName="neutron-db-sync" Nov 27 11:26:49 crc 
kubenswrapper[4807]: I1127 11:26:49.033495 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda15506-8b54-4097-ba8e-55dabab3ede7" containerName="neutron-db-sync" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.033691 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcecd1ef-65f5-47b8-8d41-55b3da46db65" containerName="dnsmasq-dns" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.033710 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="fda15506-8b54-4097-ba8e-55dabab3ede7" containerName="neutron-db-sync" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.033729 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8644cb7-09bb-4783-aef2-e3dd150616d2" containerName="glance-httpd" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.033743 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8644cb7-09bb-4783-aef2-e3dd150616d2" containerName="glance-log" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.034910 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.037090 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.037417 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.044715 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.119446 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c31ec5c2-fceb-48fd-a060-88185742b123-logs\") pod \"c31ec5c2-fceb-48fd-a060-88185742b123\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.119517 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7xbf\" (UniqueName: \"kubernetes.io/projected/fda15506-8b54-4097-ba8e-55dabab3ede7-kube-api-access-s7xbf\") pod \"fda15506-8b54-4097-ba8e-55dabab3ede7\" (UID: \"fda15506-8b54-4097-ba8e-55dabab3ede7\") " Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.119575 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fda15506-8b54-4097-ba8e-55dabab3ede7-config\") pod \"fda15506-8b54-4097-ba8e-55dabab3ede7\" (UID: \"fda15506-8b54-4097-ba8e-55dabab3ede7\") " Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.119600 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c31ec5c2-fceb-48fd-a060-88185742b123-config-data\") pod \"c31ec5c2-fceb-48fd-a060-88185742b123\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " Nov 27 11:26:49 crc 
kubenswrapper[4807]: I1127 11:26:49.119625 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c31ec5c2-fceb-48fd-a060-88185742b123-horizon-secret-key\") pod \"c31ec5c2-fceb-48fd-a060-88185742b123\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.119727 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda15506-8b54-4097-ba8e-55dabab3ede7-combined-ca-bundle\") pod \"fda15506-8b54-4097-ba8e-55dabab3ede7\" (UID: \"fda15506-8b54-4097-ba8e-55dabab3ede7\") " Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.119784 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n99f6\" (UniqueName: \"kubernetes.io/projected/c31ec5c2-fceb-48fd-a060-88185742b123-kube-api-access-n99f6\") pod \"c31ec5c2-fceb-48fd-a060-88185742b123\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.119822 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c31ec5c2-fceb-48fd-a060-88185742b123-scripts\") pod \"c31ec5c2-fceb-48fd-a060-88185742b123\" (UID: \"c31ec5c2-fceb-48fd-a060-88185742b123\") " Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.119973 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c31ec5c2-fceb-48fd-a060-88185742b123-logs" (OuterVolumeSpecName: "logs") pod "c31ec5c2-fceb-48fd-a060-88185742b123" (UID: "c31ec5c2-fceb-48fd-a060-88185742b123"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.120425 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c31ec5c2-fceb-48fd-a060-88185742b123-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.120720 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31ec5c2-fceb-48fd-a060-88185742b123-config-data" (OuterVolumeSpecName: "config-data") pod "c31ec5c2-fceb-48fd-a060-88185742b123" (UID: "c31ec5c2-fceb-48fd-a060-88185742b123"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.121000 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c31ec5c2-fceb-48fd-a060-88185742b123-scripts" (OuterVolumeSpecName: "scripts") pod "c31ec5c2-fceb-48fd-a060-88185742b123" (UID: "c31ec5c2-fceb-48fd-a060-88185742b123"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.128461 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda15506-8b54-4097-ba8e-55dabab3ede7-kube-api-access-s7xbf" (OuterVolumeSpecName: "kube-api-access-s7xbf") pod "fda15506-8b54-4097-ba8e-55dabab3ede7" (UID: "fda15506-8b54-4097-ba8e-55dabab3ede7"). InnerVolumeSpecName "kube-api-access-s7xbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.128979 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31ec5c2-fceb-48fd-a060-88185742b123-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c31ec5c2-fceb-48fd-a060-88185742b123" (UID: "c31ec5c2-fceb-48fd-a060-88185742b123"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.142704 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31ec5c2-fceb-48fd-a060-88185742b123-kube-api-access-n99f6" (OuterVolumeSpecName: "kube-api-access-n99f6") pod "c31ec5c2-fceb-48fd-a060-88185742b123" (UID: "c31ec5c2-fceb-48fd-a060-88185742b123"). InnerVolumeSpecName "kube-api-access-n99f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.149004 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda15506-8b54-4097-ba8e-55dabab3ede7-config" (OuterVolumeSpecName: "config") pod "fda15506-8b54-4097-ba8e-55dabab3ede7" (UID: "fda15506-8b54-4097-ba8e-55dabab3ede7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.149895 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda15506-8b54-4097-ba8e-55dabab3ede7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fda15506-8b54-4097-ba8e-55dabab3ede7" (UID: "fda15506-8b54-4097-ba8e-55dabab3ede7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.221615 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vjlh\" (UniqueName: \"kubernetes.io/projected/da9ed693-7637-404d-8dd9-e849e11b4d43-kube-api-access-6vjlh\") pod \"da9ed693-7637-404d-8dd9-e849e11b4d43\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.221815 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/da9ed693-7637-404d-8dd9-e849e11b4d43-horizon-secret-key\") pod \"da9ed693-7637-404d-8dd9-e849e11b4d43\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.221876 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da9ed693-7637-404d-8dd9-e849e11b4d43-logs\") pod \"da9ed693-7637-404d-8dd9-e849e11b4d43\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.221917 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da9ed693-7637-404d-8dd9-e849e11b4d43-config-data\") pod \"da9ed693-7637-404d-8dd9-e849e11b4d43\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.222008 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da9ed693-7637-404d-8dd9-e849e11b4d43-scripts\") pod \"da9ed693-7637-404d-8dd9-e849e11b4d43\" (UID: \"da9ed693-7637-404d-8dd9-e849e11b4d43\") " Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.222285 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/9b84df57-c9f1-4b55-ab31-7133a1d0841f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.222289 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da9ed693-7637-404d-8dd9-e849e11b4d43-logs" (OuterVolumeSpecName: "logs") pod "da9ed693-7637-404d-8dd9-e849e11b4d43" (UID: "da9ed693-7637-404d-8dd9-e849e11b4d43"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.222448 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.222497 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.222541 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.222561 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/da9ed693-7637-404d-8dd9-e849e11b4d43-scripts" (OuterVolumeSpecName: "scripts") pod "da9ed693-7637-404d-8dd9-e849e11b4d43" (UID: "da9ed693-7637-404d-8dd9-e849e11b4d43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.222606 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da9ed693-7637-404d-8dd9-e849e11b4d43-config-data" (OuterVolumeSpecName: "config-data") pod "da9ed693-7637-404d-8dd9-e849e11b4d43" (UID: "da9ed693-7637-404d-8dd9-e849e11b4d43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.222641 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b84df57-c9f1-4b55-ab31-7133a1d0841f-logs\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.222709 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pd86\" (UniqueName: \"kubernetes.io/projected/9b84df57-c9f1-4b55-ab31-7133a1d0841f-kube-api-access-6pd86\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.222736 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 
11:26:49.223011 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.223094 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda15506-8b54-4097-ba8e-55dabab3ede7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.223109 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/da9ed693-7637-404d-8dd9-e849e11b4d43-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.223122 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n99f6\" (UniqueName: \"kubernetes.io/projected/c31ec5c2-fceb-48fd-a060-88185742b123-kube-api-access-n99f6\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.223134 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c31ec5c2-fceb-48fd-a060-88185742b123-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.223145 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7xbf\" (UniqueName: \"kubernetes.io/projected/fda15506-8b54-4097-ba8e-55dabab3ede7-kube-api-access-s7xbf\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.223155 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fda15506-8b54-4097-ba8e-55dabab3ede7-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.223165 
4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c31ec5c2-fceb-48fd-a060-88185742b123-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.223175 4807 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c31ec5c2-fceb-48fd-a060-88185742b123-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.223185 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da9ed693-7637-404d-8dd9-e849e11b4d43-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.223195 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da9ed693-7637-404d-8dd9-e849e11b4d43-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.224757 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da9ed693-7637-404d-8dd9-e849e11b4d43-kube-api-access-6vjlh" (OuterVolumeSpecName: "kube-api-access-6vjlh") pod "da9ed693-7637-404d-8dd9-e849e11b4d43" (UID: "da9ed693-7637-404d-8dd9-e849e11b4d43"). InnerVolumeSpecName "kube-api-access-6vjlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.225870 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da9ed693-7637-404d-8dd9-e849e11b4d43-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "da9ed693-7637-404d-8dd9-e849e11b4d43" (UID: "da9ed693-7637-404d-8dd9-e849e11b4d43"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.324738 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pd86\" (UniqueName: \"kubernetes.io/projected/9b84df57-c9f1-4b55-ab31-7133a1d0841f-kube-api-access-6pd86\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.324780 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.324802 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.324838 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b84df57-c9f1-4b55-ab31-7133a1d0841f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.324871 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 
crc kubenswrapper[4807]: I1127 11:26:49.324894 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.324925 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.324963 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b84df57-c9f1-4b55-ab31-7133a1d0841f-logs\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.325033 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vjlh\" (UniqueName: \"kubernetes.io/projected/da9ed693-7637-404d-8dd9-e849e11b4d43-kube-api-access-6vjlh\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.325045 4807 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/da9ed693-7637-404d-8dd9-e849e11b4d43-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.325565 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b84df57-c9f1-4b55-ab31-7133a1d0841f-logs\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " 
pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.325916 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.326640 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b84df57-c9f1-4b55-ab31-7133a1d0841f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.328567 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.329116 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.329819 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 
11:26:49.330531 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.345488 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pd86\" (UniqueName: \"kubernetes.io/projected/9b84df57-c9f1-4b55-ab31-7133a1d0841f-kube-api-access-6pd86\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.356185 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.543110 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8644cb7-09bb-4783-aef2-e3dd150616d2" path="/var/lib/kubelet/pods/c8644cb7-09bb-4783-aef2-e3dd150616d2/volumes" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.544007 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcecd1ef-65f5-47b8-8d41-55b3da46db65" path="/var/lib/kubelet/pods/fcecd1ef-65f5-47b8-8d41-55b3da46db65/volumes" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.650019 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.957408 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6db58799f7-fkmzb" event={"ID":"da9ed693-7637-404d-8dd9-e849e11b4d43","Type":"ContainerDied","Data":"9144722eccde7552e9ba12b12416b8496753174802bf8f2a7934f0b6f4de002d"} Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.957604 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6db58799f7-fkmzb" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.961611 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f6577b49-lkc4r" event={"ID":"c31ec5c2-fceb-48fd-a060-88185742b123","Type":"ContainerDied","Data":"4e9b707e872478a3e3de511b93fdf6e46872b19908a9809bbfc0f3cfcfad0c57"} Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.961677 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f6577b49-lkc4r" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.963675 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-5w7sj" Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.963677 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-5w7sj" event={"ID":"fda15506-8b54-4097-ba8e-55dabab3ede7","Type":"ContainerDied","Data":"9d457037c3d6f125f03b3616a9a2cc4906a98b06c88ad8b89d5064e10b6a9a30"} Nov 27 11:26:49 crc kubenswrapper[4807]: I1127 11:26:49.964065 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d457037c3d6f125f03b3616a9a2cc4906a98b06c88ad8b89d5064e10b6a9a30" Nov 27 11:26:49 crc kubenswrapper[4807]: E1127 11:26:49.966698 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-rbw47" podUID="b2df3b54-f71f-469f-92e5-8c1daeb90a45" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.022454 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6db58799f7-fkmzb"] Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.028493 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6db58799f7-fkmzb"] Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.053013 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f6577b49-lkc4r"] Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.059553 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f6577b49-lkc4r"] Nov 27 11:26:50 crc kubenswrapper[4807]: E1127 11:26:50.383551 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 27 11:26:50 crc kubenswrapper[4807]: E1127 11:26:50.383885 4807 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rtldw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SEL
inuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-whdsr_openstack(1b745997-2256-496c-acee-f804c263ec35): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 11:26:50 crc kubenswrapper[4807]: E1127 11:26:50.385431 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-whdsr" podUID="1b745997-2256-496c-acee-f804c263ec35" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.389152 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bcpjh"] Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.391080 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.400044 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bcpjh"] Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.449180 4807 scope.go:117] "RemoveContainer" containerID="83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de" Nov 27 11:26:50 crc kubenswrapper[4807]: E1127 11:26:50.462501 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de\": container with ID starting with 83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de not found: ID does not exist" containerID="83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.462536 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de"} err="failed to get container status \"83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de\": rpc error: code = NotFound desc = could not find container \"83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de\": container with ID starting with 83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de not found: ID does not exist" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.462560 4807 scope.go:117] "RemoveContainer" containerID="0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b" Nov 27 11:26:50 crc kubenswrapper[4807]: E1127 11:26:50.467220 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b\": container with ID starting with 
0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b not found: ID does not exist" containerID="0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.467282 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b"} err="failed to get container status \"0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b\": rpc error: code = NotFound desc = could not find container \"0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b\": container with ID starting with 0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b not found: ID does not exist" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.467311 4807 scope.go:117] "RemoveContainer" containerID="83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.471414 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de"} err="failed to get container status \"83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de\": rpc error: code = NotFound desc = could not find container \"83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de\": container with ID starting with 83b39dda6f4b9a698fca332dd9d47368003b92f84088108392c3ab783cdf89de not found: ID does not exist" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.471456 4807 scope.go:117] "RemoveContainer" containerID="0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.473798 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b"} err="failed to get container status 
\"0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b\": rpc error: code = NotFound desc = could not find container \"0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b\": container with ID starting with 0deafbc87daa705031629b5e21001659d0fca45196ccf42b88329cc71f00c24b not found: ID does not exist" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.473836 4807 scope.go:117] "RemoveContainer" containerID="0ecfeefe74c9d576bd862dea32bece7737d6aba8a800b8a2109be564641c56e9" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.481378 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b467c77b4-xkthn"] Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.482785 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.485950 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.486083 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.486208 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-kc8dv" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.486312 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.516573 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b467c77b4-xkthn"] Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.549314 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: 
\"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.549355 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.549525 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.549567 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.549650 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-config\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:50 crc kubenswrapper[4807]: I1127 11:26:50.549683 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6b47\" (UniqueName: \"kubernetes.io/projected/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-kube-api-access-d6b47\") pod 
\"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.654091 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.654146 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.654216 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-combined-ca-bundle\") pod \"neutron-b467c77b4-xkthn\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.654271 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-config\") pod \"neutron-b467c77b4-xkthn\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.654288 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-ovndb-tls-certs\") pod \"neutron-b467c77b4-xkthn\" 
(UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.654307 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.654324 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.654367 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-config\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.654387 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6b47\" (UniqueName: \"kubernetes.io/projected/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-kube-api-access-d6b47\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.654408 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcfjz\" (UniqueName: \"kubernetes.io/projected/7d8965e4-cee0-4551-8bd1-f8322e804eef-kube-api-access-wcfjz\") pod \"neutron-b467c77b4-xkthn\" (UID: 
\"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.654428 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-httpd-config\") pod \"neutron-b467c77b4-xkthn\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.655258 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.657660 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.657911 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.658375 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 
27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.661696 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-config\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.664339 4807 scope.go:117] "RemoveContainer" containerID="7cb9709b919bee9cd99e40a0058753c87b76d6dcd597e653b6a4c1eac309bf86" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.678441 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6b47\" (UniqueName: \"kubernetes.io/projected/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-kube-api-access-d6b47\") pod \"dnsmasq-dns-55f844cf75-bcpjh\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.745572 4807 scope.go:117] "RemoveContainer" containerID="7677a534b698f567bf3cf669b38eca8fd4eee4df46f5f7240e5db151c875f6af" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.756933 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-combined-ca-bundle\") pod \"neutron-b467c77b4-xkthn\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.757013 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-config\") pod \"neutron-b467c77b4-xkthn\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.757054 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-ovndb-tls-certs\") pod \"neutron-b467c77b4-xkthn\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.757452 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcfjz\" (UniqueName: \"kubernetes.io/projected/7d8965e4-cee0-4551-8bd1-f8322e804eef-kube-api-access-wcfjz\") pod \"neutron-b467c77b4-xkthn\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.757537 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-httpd-config\") pod \"neutron-b467c77b4-xkthn\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.762663 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-config\") pod \"neutron-b467c77b4-xkthn\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.765263 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-ovndb-tls-certs\") pod \"neutron-b467c77b4-xkthn\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.765455 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-combined-ca-bundle\") pod 
\"neutron-b467c77b4-xkthn\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.766330 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-httpd-config\") pod \"neutron-b467c77b4-xkthn\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.775872 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcfjz\" (UniqueName: \"kubernetes.io/projected/7d8965e4-cee0-4551-8bd1-f8322e804eef-kube-api-access-wcfjz\") pod \"neutron-b467c77b4-xkthn\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.792510 4807 scope.go:117] "RemoveContainer" containerID="03c8dac7e13a41a25540dcc2719f59c8357ca6d74e4872a6390a02c3327452a8" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.804720 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.917932 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.921175 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.921223 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.921293 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.922180 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f949ac50efe1cb33ac8f9f8fad96e486a8238fcb507e2fef2a39dd8e43ee4952"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:50.923065 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://f949ac50efe1cb33ac8f9f8fad96e486a8238fcb507e2fef2a39dd8e43ee4952" gracePeriod=600 Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:51.027970 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-84ddb7cbfc-7lt2m" event={"ID":"a0ccd0fd-c32d-40c1-b281-35b2b322faf7","Type":"ContainerStarted","Data":"3f8babf2fcc7df3c6ede7d3b902b0240fcef73912c000b5339a196262faab499"} Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:51.040630 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a3d55df-3e92-4cb5-aedd-7589b72d5471","Type":"ContainerStarted","Data":"5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c"} Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:51.047477 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-sgwqm" event={"ID":"0f65bbc0-75a0-4294-9cf8-0023799a1fea","Type":"ContainerStarted","Data":"ddd100747950730a02526bbee588199cf983baa95869b5cbdcc83ac734a1ede8"} Nov 27 11:26:51 crc kubenswrapper[4807]: E1127 11:26:51.048741 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-whdsr" podUID="1b745997-2256-496c-acee-f804c263ec35" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:51.111636 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-sgwqm" podStartSLOduration=2.7337020130000003 podStartE2EDuration="32.11161775s" podCreationTimestamp="2025-11-27 11:26:19 +0000 UTC" firstStartedPulling="2025-11-27 11:26:21.32853258 +0000 UTC m=+1022.428030778" lastFinishedPulling="2025-11-27 11:26:50.706448317 +0000 UTC m=+1051.805946515" observedRunningTime="2025-11-27 11:26:51.106403952 +0000 UTC m=+1052.205902150" watchObservedRunningTime="2025-11-27 11:26:51.11161775 +0000 UTC m=+1052.211115948" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:51.125188 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:51 
crc kubenswrapper[4807]: I1127 11:26:51.542291 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c31ec5c2-fceb-48fd-a060-88185742b123" path="/var/lib/kubelet/pods/c31ec5c2-fceb-48fd-a060-88185742b123/volumes" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:51.543549 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da9ed693-7637-404d-8dd9-e849e11b4d43" path="/var/lib/kubelet/pods/da9ed693-7637-404d-8dd9-e849e11b4d43/volumes" Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:51.771097 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d69cff6fb-88t5t"] Nov 27 11:26:51 crc kubenswrapper[4807]: I1127 11:26:51.863514 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lqqgw"] Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.058376 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ef7696d-30d7-4785-97ae-0910fa44871c","Type":"ContainerStarted","Data":"d4eb8b04f37304bade24bc6eaae46130731323e7340097ae93c2f50a4003c1fa"} Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.058408 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ef7696d-30d7-4785-97ae-0910fa44871c","Type":"ContainerStarted","Data":"f69d3b3fa0ec5f6521d2a5826c3d02d95d9d0f509fa86ef059f32793a86256d0"} Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.059726 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d69cff6fb-88t5t" event={"ID":"ba9d500c-ec74-4755-924d-8b6160bb51dc","Type":"ContainerStarted","Data":"5aa4eaf96df0bcb8f0e73f1f28dce3ceaeb8bc275b1193dd90a01daa4532f40c"} Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.064346 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lqqgw" 
event={"ID":"3e03b856-d6ea-40a5-96db-10d788131661","Type":"ContainerStarted","Data":"24f4c86444c92b9d5962451d7ad79d08083df68b872e46b390d7c970722ce1f6"} Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.074808 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bcpjh"] Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.085878 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84ddb7cbfc-7lt2m" event={"ID":"a0ccd0fd-c32d-40c1-b281-35b2b322faf7","Type":"ContainerStarted","Data":"e5cf7ce9b0cdb612da9ff66a1d8fbc4233e4c7ae33c0216e6a817a6e3a7c29cc"} Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.086104 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84ddb7cbfc-7lt2m" podUID="a0ccd0fd-c32d-40c1-b281-35b2b322faf7" containerName="horizon-log" containerID="cri-o://3f8babf2fcc7df3c6ede7d3b902b0240fcef73912c000b5339a196262faab499" gracePeriod=30 Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.086891 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84ddb7cbfc-7lt2m" podUID="a0ccd0fd-c32d-40c1-b281-35b2b322faf7" containerName="horizon" containerID="cri-o://e5cf7ce9b0cdb612da9ff66a1d8fbc4233e4c7ae33c0216e6a817a6e3a7c29cc" gracePeriod=30 Nov 27 11:26:52 crc kubenswrapper[4807]: W1127 11:26:52.091065 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce34d4ab_e6ee_43c3_9ad7_c4312bae955e.slice/crio-9fcab10dbcaf8471a64ddfc0f84f9b4b1a7ab3a2843eb2be4874bb0fd19267d4 WatchSource:0}: Error finding container 9fcab10dbcaf8471a64ddfc0f84f9b4b1a7ab3a2843eb2be4874bb0fd19267d4: Status 404 returned error can't find the container with id 9fcab10dbcaf8471a64ddfc0f84f9b4b1a7ab3a2843eb2be4874bb0fd19267d4 Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.096563 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerID="f949ac50efe1cb33ac8f9f8fad96e486a8238fcb507e2fef2a39dd8e43ee4952" exitCode=0 Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.096600 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"f949ac50efe1cb33ac8f9f8fad96e486a8238fcb507e2fef2a39dd8e43ee4952"} Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.096625 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"039fe769a3645e4feb5875769fb8911726da6d6d91da20c8da03dd3106e1c39e"} Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.096644 4807 scope.go:117] "RemoveContainer" containerID="5c16a271e2f512c2b0b496bddb1e050219d71c041d2a908668448dc5280aeab0" Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.146203 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7b8bd6c76d-jg9hp"] Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.150506 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-84ddb7cbfc-7lt2m" podStartSLOduration=4.53949707 podStartE2EDuration="30.150487296s" podCreationTimestamp="2025-11-27 11:26:22 +0000 UTC" firstStartedPulling="2025-11-27 11:26:23.387531557 +0000 UTC m=+1024.487029755" lastFinishedPulling="2025-11-27 11:26:48.998521783 +0000 UTC m=+1050.098019981" observedRunningTime="2025-11-27 11:26:52.108576338 +0000 UTC m=+1053.208074536" watchObservedRunningTime="2025-11-27 11:26:52.150487296 +0000 UTC m=+1053.249985484" Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.203821 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.515600 
4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:26:52 crc kubenswrapper[4807]: I1127 11:26:52.735330 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b467c77b4-xkthn"] Nov 27 11:26:52 crc kubenswrapper[4807]: W1127 11:26:52.915977 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d8965e4_cee0_4551_8bd1_f8322e804eef.slice/crio-0232d38e27b74d1502e3e485b365c500c1732859da22264dab00cf51d08ceafb WatchSource:0}: Error finding container 0232d38e27b74d1502e3e485b365c500c1732859da22264dab00cf51d08ceafb: Status 404 returned error can't find the container with id 0232d38e27b74d1502e3e485b365c500c1732859da22264dab00cf51d08ceafb Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.002414 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ff549ff99-zxxvk"] Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.004175 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.009867 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.010087 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.014861 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ff549ff99-zxxvk"] Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.132281 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-httpd-config\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.132545 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-ovndb-tls-certs\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.132691 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-combined-ca-bundle\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.132767 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-public-tls-certs\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.132842 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-config\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.133080 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c55r\" (UniqueName: \"kubernetes.io/projected/311f3fc5-b5ab-4fd9-8146-7442b0b29409-kube-api-access-4c55r\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.133161 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-internal-tls-certs\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.160228 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lqqgw" event={"ID":"3e03b856-d6ea-40a5-96db-10d788131661","Type":"ContainerStarted","Data":"94fbd060b6de61963187fbe3a10974c6b2c95045586e88ea94cef730b8529b67"} Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.162173 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9b84df57-c9f1-4b55-ab31-7133a1d0841f","Type":"ContainerStarted","Data":"36488c1345b93132845ea300ca5ccb92cdf660fd57a5523a785674d8644b31cd"} Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.188158 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8bd6c76d-jg9hp" event={"ID":"972db85e-5d7f-4312-b2c1-36f3c4e697d3","Type":"ContainerStarted","Data":"0431edfd02689c766bfba84526203198805f6a64725b59ad7a98d79e2b6095d8"} Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.188371 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8bd6c76d-jg9hp" event={"ID":"972db85e-5d7f-4312-b2c1-36f3c4e697d3","Type":"ContainerStarted","Data":"0ac0798928a7da5c337222df919f478a70538f3778eb4e26cc5c2ba283aa1a5b"} Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.195362 4807 generic.go:334] "Generic (PLEG): container finished" podID="ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" containerID="c70f8d1e598b23a30ebcb412146eeced718b0f0771e5578519ea18581044905e" exitCode=0 Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.195573 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" event={"ID":"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e","Type":"ContainerDied","Data":"c70f8d1e598b23a30ebcb412146eeced718b0f0771e5578519ea18581044905e"} Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.195691 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" event={"ID":"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e","Type":"ContainerStarted","Data":"9fcab10dbcaf8471a64ddfc0f84f9b4b1a7ab3a2843eb2be4874bb0fd19267d4"} Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.196290 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lqqgw" podStartSLOduration=18.196279086 podStartE2EDuration="18.196279086s" podCreationTimestamp="2025-11-27 11:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:53.189167778 +0000 UTC m=+1054.288665986" watchObservedRunningTime="2025-11-27 11:26:53.196279086 +0000 UTC m=+1054.295777284" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.199364 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b467c77b4-xkthn" event={"ID":"7d8965e4-cee0-4551-8bd1-f8322e804eef","Type":"ContainerStarted","Data":"0232d38e27b74d1502e3e485b365c500c1732859da22264dab00cf51d08ceafb"} Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.206865 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d69cff6fb-88t5t" event={"ID":"ba9d500c-ec74-4755-924d-8b6160bb51dc","Type":"ContainerStarted","Data":"93d1870e2e383fddbfbde82fba7e72784216a9c85b1639bbe54b1f48f28d4e17"} Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.218432 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ef7696d-30d7-4785-97ae-0910fa44871c","Type":"ContainerStarted","Data":"ee8467ac459aca147cfcd30968010ab7e3803cf2af7bca5fc6bab848f82e660e"} Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.218607 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8ef7696d-30d7-4785-97ae-0910fa44871c" containerName="glance-log" containerID="cri-o://d4eb8b04f37304bade24bc6eaae46130731323e7340097ae93c2f50a4003c1fa" gracePeriod=30 Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.218736 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8ef7696d-30d7-4785-97ae-0910fa44871c" containerName="glance-httpd" containerID="cri-o://ee8467ac459aca147cfcd30968010ab7e3803cf2af7bca5fc6bab848f82e660e" gracePeriod=30 Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.236369 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-httpd-config\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.236433 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-ovndb-tls-certs\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.236546 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-combined-ca-bundle\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.236573 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-public-tls-certs\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.236594 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-config\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.236626 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c55r\" (UniqueName: 
\"kubernetes.io/projected/311f3fc5-b5ab-4fd9-8146-7442b0b29409-kube-api-access-4c55r\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.236651 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-internal-tls-certs\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.247935 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-internal-tls-certs\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.249385 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-httpd-config\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.250823 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=28.250800977 podStartE2EDuration="28.250800977s" podCreationTimestamp="2025-11-27 11:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:53.247714126 +0000 UTC m=+1054.347212314" watchObservedRunningTime="2025-11-27 11:26:53.250800977 +0000 UTC m=+1054.350299175" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.251610 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-combined-ca-bundle\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.252453 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-public-tls-certs\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.259657 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-config\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.259807 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/311f3fc5-b5ab-4fd9-8146-7442b0b29409-ovndb-tls-certs\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.265918 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c55r\" (UniqueName: \"kubernetes.io/projected/311f3fc5-b5ab-4fd9-8146-7442b0b29409-kube-api-access-4c55r\") pod \"neutron-ff549ff99-zxxvk\" (UID: \"311f3fc5-b5ab-4fd9-8146-7442b0b29409\") " pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:53 crc kubenswrapper[4807]: I1127 11:26:53.423577 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.286800 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d69cff6fb-88t5t" event={"ID":"ba9d500c-ec74-4755-924d-8b6160bb51dc","Type":"ContainerStarted","Data":"1c6acd20932960875f831bf52cf7a3d97a320a1f89faa80f74dd556bc617b36e"} Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.287268 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ff549ff99-zxxvk"] Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.325590 4807 generic.go:334] "Generic (PLEG): container finished" podID="0f65bbc0-75a0-4294-9cf8-0023799a1fea" containerID="ddd100747950730a02526bbee588199cf983baa95869b5cbdcc83ac734a1ede8" exitCode=0 Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.325686 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-sgwqm" event={"ID":"0f65bbc0-75a0-4294-9cf8-0023799a1fea","Type":"ContainerDied","Data":"ddd100747950730a02526bbee588199cf983baa95869b5cbdcc83ac734a1ede8"} Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.342909 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7d69cff6fb-88t5t" podStartSLOduration=22.34289599 podStartE2EDuration="22.34289599s" podCreationTimestamp="2025-11-27 11:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:54.341499283 +0000 UTC m=+1055.440997481" watchObservedRunningTime="2025-11-27 11:26:54.34289599 +0000 UTC m=+1055.442394188" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.346549 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b84df57-c9f1-4b55-ab31-7133a1d0841f","Type":"ContainerStarted","Data":"526d0bbbf0f2beccfa2ee989e3a938c3c9ce94aca54e0f5a3032d5ee616b4023"} Nov 27 11:26:54 crc 
kubenswrapper[4807]: I1127 11:26:54.393388 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8bd6c76d-jg9hp" event={"ID":"972db85e-5d7f-4312-b2c1-36f3c4e697d3","Type":"ContainerStarted","Data":"20965bc1b41dbc2aaabb0bc9a8457ca68ec138460ec493e4bad74931239240d6"} Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.439629 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" event={"ID":"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e","Type":"ContainerStarted","Data":"04788b28d9756e717adedcf5f850a9882ebeb4ba064d097cb874101afe08b0e1"} Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.440623 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.455788 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7b8bd6c76d-jg9hp" podStartSLOduration=22.455766584 podStartE2EDuration="22.455766584s" podCreationTimestamp="2025-11-27 11:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:54.429968122 +0000 UTC m=+1055.529466320" watchObservedRunningTime="2025-11-27 11:26:54.455766584 +0000 UTC m=+1055.555264782" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.486515 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" podStartSLOduration=4.486497737 podStartE2EDuration="4.486497737s" podCreationTimestamp="2025-11-27 11:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:54.473631766 +0000 UTC m=+1055.573129964" watchObservedRunningTime="2025-11-27 11:26:54.486497737 +0000 UTC m=+1055.585995935" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.501375 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b467c77b4-xkthn" event={"ID":"7d8965e4-cee0-4551-8bd1-f8322e804eef","Type":"ContainerStarted","Data":"bc38cb6be7fa696e810dbf7a6fb2e74170135c020b51380d574297c7b79ab57d"} Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.501434 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b467c77b4-xkthn" event={"ID":"7d8965e4-cee0-4551-8bd1-f8322e804eef","Type":"ContainerStarted","Data":"ecad29e8a69528d2c562482c4041bf612ef0b19a9c64d780177334d565f7433c"} Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.502996 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.513522 4807 generic.go:334] "Generic (PLEG): container finished" podID="8ef7696d-30d7-4785-97ae-0910fa44871c" containerID="ee8467ac459aca147cfcd30968010ab7e3803cf2af7bca5fc6bab848f82e660e" exitCode=0 Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.513558 4807 generic.go:334] "Generic (PLEG): container finished" podID="8ef7696d-30d7-4785-97ae-0910fa44871c" containerID="d4eb8b04f37304bade24bc6eaae46130731323e7340097ae93c2f50a4003c1fa" exitCode=143 Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.513625 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ef7696d-30d7-4785-97ae-0910fa44871c","Type":"ContainerDied","Data":"ee8467ac459aca147cfcd30968010ab7e3803cf2af7bca5fc6bab848f82e660e"} Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.513652 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ef7696d-30d7-4785-97ae-0910fa44871c","Type":"ContainerDied","Data":"d4eb8b04f37304bade24bc6eaae46130731323e7340097ae93c2f50a4003c1fa"} Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.531301 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"7a3d55df-3e92-4cb5-aedd-7589b72d5471","Type":"ContainerStarted","Data":"61a0a483cdc37e763bcba6a67e1ca79f1c1a4c94a9cb3d7a613306bf0b770075"} Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.534655 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b467c77b4-xkthn" podStartSLOduration=4.534636919 podStartE2EDuration="4.534636919s" podCreationTimestamp="2025-11-27 11:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:54.5274674 +0000 UTC m=+1055.626965598" watchObservedRunningTime="2025-11-27 11:26:54.534636919 +0000 UTC m=+1055.634135117" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.544305 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.695180 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef7696d-30d7-4785-97ae-0910fa44871c-logs\") pod \"8ef7696d-30d7-4785-97ae-0910fa44871c\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.695484 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"8ef7696d-30d7-4785-97ae-0910fa44871c\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.695606 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-scripts\") pod \"8ef7696d-30d7-4785-97ae-0910fa44871c\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.695643 4807 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-combined-ca-bundle\") pod \"8ef7696d-30d7-4785-97ae-0910fa44871c\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.695692 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5qcj\" (UniqueName: \"kubernetes.io/projected/8ef7696d-30d7-4785-97ae-0910fa44871c-kube-api-access-c5qcj\") pod \"8ef7696d-30d7-4785-97ae-0910fa44871c\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.695875 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-config-data\") pod \"8ef7696d-30d7-4785-97ae-0910fa44871c\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.695905 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ef7696d-30d7-4785-97ae-0910fa44871c-httpd-run\") pod \"8ef7696d-30d7-4785-97ae-0910fa44871c\" (UID: \"8ef7696d-30d7-4785-97ae-0910fa44871c\") " Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.696175 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef7696d-30d7-4785-97ae-0910fa44871c-logs" (OuterVolumeSpecName: "logs") pod "8ef7696d-30d7-4785-97ae-0910fa44871c" (UID: "8ef7696d-30d7-4785-97ae-0910fa44871c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.696419 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8ef7696d-30d7-4785-97ae-0910fa44871c-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.697057 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ef7696d-30d7-4785-97ae-0910fa44871c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8ef7696d-30d7-4785-97ae-0910fa44871c" (UID: "8ef7696d-30d7-4785-97ae-0910fa44871c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.700304 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-scripts" (OuterVolumeSpecName: "scripts") pod "8ef7696d-30d7-4785-97ae-0910fa44871c" (UID: "8ef7696d-30d7-4785-97ae-0910fa44871c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.704181 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ef7696d-30d7-4785-97ae-0910fa44871c-kube-api-access-c5qcj" (OuterVolumeSpecName: "kube-api-access-c5qcj") pod "8ef7696d-30d7-4785-97ae-0910fa44871c" (UID: "8ef7696d-30d7-4785-97ae-0910fa44871c"). InnerVolumeSpecName "kube-api-access-c5qcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.737640 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "8ef7696d-30d7-4785-97ae-0910fa44871c" (UID: "8ef7696d-30d7-4785-97ae-0910fa44871c"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.742772 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ef7696d-30d7-4785-97ae-0910fa44871c" (UID: "8ef7696d-30d7-4785-97ae-0910fa44871c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.781503 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-config-data" (OuterVolumeSpecName: "config-data") pod "8ef7696d-30d7-4785-97ae-0910fa44871c" (UID: "8ef7696d-30d7-4785-97ae-0910fa44871c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.800394 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.800442 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5qcj\" (UniqueName: \"kubernetes.io/projected/8ef7696d-30d7-4785-97ae-0910fa44871c-kube-api-access-c5qcj\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.800458 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.800469 4807 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8ef7696d-30d7-4785-97ae-0910fa44871c-httpd-run\") on node \"crc\" DevicePath 
\"\"" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.800501 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.800509 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ef7696d-30d7-4785-97ae-0910fa44871c-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.843392 4807 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 27 11:26:54 crc kubenswrapper[4807]: I1127 11:26:54.904405 4807 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.550039 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff549ff99-zxxvk" event={"ID":"311f3fc5-b5ab-4fd9-8146-7442b0b29409","Type":"ContainerStarted","Data":"2d4632a1808540901128e8b663eb2d7e8ffa5a6045e62861eef1db2378661ed5"} Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.550075 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff549ff99-zxxvk" event={"ID":"311f3fc5-b5ab-4fd9-8146-7442b0b29409","Type":"ContainerStarted","Data":"94270cf67878bb6760bee5a18500abe07bbcc53c82669ecef6853ed7db280f48"} Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.568795 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.568867 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8ef7696d-30d7-4785-97ae-0910fa44871c","Type":"ContainerDied","Data":"f69d3b3fa0ec5f6521d2a5826c3d02d95d9d0f509fa86ef059f32793a86256d0"} Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.568902 4807 scope.go:117] "RemoveContainer" containerID="ee8467ac459aca147cfcd30968010ab7e3803cf2af7bca5fc6bab848f82e660e" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.631786 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.667640 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.683774 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:55 crc kubenswrapper[4807]: E1127 11:26:55.684504 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef7696d-30d7-4785-97ae-0910fa44871c" containerName="glance-log" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.684521 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef7696d-30d7-4785-97ae-0910fa44871c" containerName="glance-log" Nov 27 11:26:55 crc kubenswrapper[4807]: E1127 11:26:55.684548 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ef7696d-30d7-4785-97ae-0910fa44871c" containerName="glance-httpd" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.684554 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ef7696d-30d7-4785-97ae-0910fa44871c" containerName="glance-httpd" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.684849 4807 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8ef7696d-30d7-4785-97ae-0910fa44871c" containerName="glance-httpd" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.684875 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ef7696d-30d7-4785-97ae-0910fa44871c" containerName="glance-log" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.687223 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.692603 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.694977 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.725008 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.784436 4807 scope.go:117] "RemoveContainer" containerID="d4eb8b04f37304bade24bc6eaae46130731323e7340097ae93c2f50a4003c1fa" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.828339 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-scripts\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.828424 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc 
kubenswrapper[4807]: I1127 11:26:55.828463 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.828491 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da707f4d-a2cd-426f-b524-874435ef409c-logs\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.828556 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wv4d\" (UniqueName: \"kubernetes.io/projected/da707f4d-a2cd-426f-b524-874435ef409c-kube-api-access-7wv4d\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.828641 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.828704 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da707f4d-a2cd-426f-b524-874435ef409c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 
11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.828733 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-config-data\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.930221 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.930312 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da707f4d-a2cd-426f-b524-874435ef409c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.930333 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-config-data\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.930349 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-scripts\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.930386 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.930409 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.930426 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da707f4d-a2cd-426f-b524-874435ef409c-logs\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.930471 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wv4d\" (UniqueName: \"kubernetes.io/projected/da707f4d-a2cd-426f-b524-874435ef409c-kube-api-access-7wv4d\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.930719 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da707f4d-a2cd-426f-b524-874435ef409c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.930968 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.931601 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da707f4d-a2cd-426f-b524-874435ef409c-logs\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.936853 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.938308 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-config-data\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.950329 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-scripts\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.952986 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.962087 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wv4d\" (UniqueName: \"kubernetes.io/projected/da707f4d-a2cd-426f-b524-874435ef409c-kube-api-access-7wv4d\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:55 crc kubenswrapper[4807]: I1127 11:26:55.975582 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " pod="openstack/glance-default-external-api-0" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.010985 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.175481 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.237892 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-scripts\") pod \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.238019 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-combined-ca-bundle\") pod \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.238053 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6j5n\" (UniqueName: \"kubernetes.io/projected/0f65bbc0-75a0-4294-9cf8-0023799a1fea-kube-api-access-v6j5n\") pod \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.238123 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-config-data\") pod \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.238187 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f65bbc0-75a0-4294-9cf8-0023799a1fea-logs\") pod \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\" (UID: \"0f65bbc0-75a0-4294-9cf8-0023799a1fea\") " Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.238908 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0f65bbc0-75a0-4294-9cf8-0023799a1fea-logs" (OuterVolumeSpecName: "logs") pod "0f65bbc0-75a0-4294-9cf8-0023799a1fea" (UID: "0f65bbc0-75a0-4294-9cf8-0023799a1fea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.251672 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-scripts" (OuterVolumeSpecName: "scripts") pod "0f65bbc0-75a0-4294-9cf8-0023799a1fea" (UID: "0f65bbc0-75a0-4294-9cf8-0023799a1fea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.266433 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f65bbc0-75a0-4294-9cf8-0023799a1fea-kube-api-access-v6j5n" (OuterVolumeSpecName: "kube-api-access-v6j5n") pod "0f65bbc0-75a0-4294-9cf8-0023799a1fea" (UID: "0f65bbc0-75a0-4294-9cf8-0023799a1fea"). InnerVolumeSpecName "kube-api-access-v6j5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.311007 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-config-data" (OuterVolumeSpecName: "config-data") pod "0f65bbc0-75a0-4294-9cf8-0023799a1fea" (UID: "0f65bbc0-75a0-4294-9cf8-0023799a1fea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.333201 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f65bbc0-75a0-4294-9cf8-0023799a1fea" (UID: "0f65bbc0-75a0-4294-9cf8-0023799a1fea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.342469 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.342511 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6j5n\" (UniqueName: \"kubernetes.io/projected/0f65bbc0-75a0-4294-9cf8-0023799a1fea-kube-api-access-v6j5n\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.342527 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.342540 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f65bbc0-75a0-4294-9cf8-0023799a1fea-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.342550 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f65bbc0-75a0-4294-9cf8-0023799a1fea-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.536111 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5449cc7d8-rpm6t"] Nov 27 11:26:56 crc kubenswrapper[4807]: E1127 11:26:56.538194 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f65bbc0-75a0-4294-9cf8-0023799a1fea" containerName="placement-db-sync" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.538209 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f65bbc0-75a0-4294-9cf8-0023799a1fea" containerName="placement-db-sync" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.538452 4807 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0f65bbc0-75a0-4294-9cf8-0023799a1fea" containerName="placement-db-sync" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.539360 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.547166 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.547426 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.557421 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5449cc7d8-rpm6t"] Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.600630 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b84df57-c9f1-4b55-ab31-7133a1d0841f","Type":"ContainerStarted","Data":"001077e87a62c11f8824eb2148bf6ad305af8a0bafe4d746200720ae51a1d2db"} Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.612893 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff549ff99-zxxvk" event={"ID":"311f3fc5-b5ab-4fd9-8146-7442b0b29409","Type":"ContainerStarted","Data":"dfc5bf7bac5a17be85baf9a93ce0d23c1479c3c83a668d3e3017a1f4605c45ff"} Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.613900 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.629869 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.629851365 podStartE2EDuration="8.629851365s" podCreationTimestamp="2025-11-27 11:26:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:56.618928186 +0000 UTC m=+1057.718426394" watchObservedRunningTime="2025-11-27 11:26:56.629851365 +0000 UTC m=+1057.729349563" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.636566 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-sgwqm" event={"ID":"0f65bbc0-75a0-4294-9cf8-0023799a1fea","Type":"ContainerDied","Data":"8d67b1bae9cfd5613d009bb602854d4a988fcbeeb484ba3d6f4e4906a0343820"} Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.636642 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d67b1bae9cfd5613d009bb602854d4a988fcbeeb484ba3d6f4e4906a0343820" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.636738 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-sgwqm" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.642792 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-ff549ff99-zxxvk" podStartSLOduration=4.642771766 podStartE2EDuration="4.642771766s" podCreationTimestamp="2025-11-27 11:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:56.637365963 +0000 UTC m=+1057.736864181" watchObservedRunningTime="2025-11-27 11:26:56.642771766 +0000 UTC m=+1057.742269964" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.648674 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-scripts\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.648748 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-config-data\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.648782 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-combined-ca-bundle\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.648854 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-public-tls-certs\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.648911 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8rn7\" (UniqueName: \"kubernetes.io/projected/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-kube-api-access-d8rn7\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.648974 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-logs\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.649013 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-internal-tls-certs\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.750170 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-scripts\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.750265 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-config-data\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.750294 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-combined-ca-bundle\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.750477 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-public-tls-certs\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.750531 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d8rn7\" (UniqueName: \"kubernetes.io/projected/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-kube-api-access-d8rn7\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.750673 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-logs\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.750715 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-internal-tls-certs\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.758575 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-logs\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.770206 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-scripts\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.772881 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-combined-ca-bundle\") pod 
\"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.773852 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-config-data\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.774233 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-internal-tls-certs\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.776910 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-public-tls-certs\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.793064 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8rn7\" (UniqueName: \"kubernetes.io/projected/e4fc9fe4-54f1-458b-b2f7-ff20982e3243-kube-api-access-d8rn7\") pod \"placement-5449cc7d8-rpm6t\" (UID: \"e4fc9fe4-54f1-458b-b2f7-ff20982e3243\") " pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.873689 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:56 crc kubenswrapper[4807]: I1127 11:26:56.900626 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:26:57 crc kubenswrapper[4807]: I1127 11:26:57.556040 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ef7696d-30d7-4785-97ae-0910fa44871c" path="/var/lib/kubelet/pods/8ef7696d-30d7-4785-97ae-0910fa44871c/volumes" Nov 27 11:26:57 crc kubenswrapper[4807]: I1127 11:26:57.557493 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5449cc7d8-rpm6t"] Nov 27 11:26:57 crc kubenswrapper[4807]: I1127 11:26:57.718170 4807 generic.go:334] "Generic (PLEG): container finished" podID="3e03b856-d6ea-40a5-96db-10d788131661" containerID="94fbd060b6de61963187fbe3a10974c6b2c95045586e88ea94cef730b8529b67" exitCode=0 Nov 27 11:26:57 crc kubenswrapper[4807]: I1127 11:26:57.718320 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lqqgw" event={"ID":"3e03b856-d6ea-40a5-96db-10d788131661","Type":"ContainerDied","Data":"94fbd060b6de61963187fbe3a10974c6b2c95045586e88ea94cef730b8529b67"} Nov 27 11:26:57 crc kubenswrapper[4807]: I1127 11:26:57.745380 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5449cc7d8-rpm6t" event={"ID":"e4fc9fe4-54f1-458b-b2f7-ff20982e3243","Type":"ContainerStarted","Data":"51ec285a8bbb6c34a07c89114b3f339df11676c27bcb4dca2c7ad8355ca9f2b1"} Nov 27 11:26:57 crc kubenswrapper[4807]: I1127 11:26:57.748018 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"da707f4d-a2cd-426f-b524-874435ef409c","Type":"ContainerStarted","Data":"d3fdbd4879dd3b2a6f4f924d37802339a97014a62f6fa91f650a841abae24e27"} Nov 27 11:26:58 crc kubenswrapper[4807]: I1127 11:26:58.771573 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"da707f4d-a2cd-426f-b524-874435ef409c","Type":"ContainerStarted","Data":"edd43dc250c0190e440bb59c0b21000ef935603af1c0356ed90b85b963aa250e"} Nov 27 11:26:58 crc kubenswrapper[4807]: I1127 11:26:58.773358 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"da707f4d-a2cd-426f-b524-874435ef409c","Type":"ContainerStarted","Data":"ad157d88570ad9ad10dd3f21af3b4681272f81ca4c53bccd1cecec85ce333532"} Nov 27 11:26:58 crc kubenswrapper[4807]: I1127 11:26:58.778709 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5449cc7d8-rpm6t" event={"ID":"e4fc9fe4-54f1-458b-b2f7-ff20982e3243","Type":"ContainerStarted","Data":"23d4cabd43fe63216b080947635f5a79e8a73a11d4ae6f87011b7a1dcfbd11e2"} Nov 27 11:26:58 crc kubenswrapper[4807]: I1127 11:26:58.778743 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:58 crc kubenswrapper[4807]: I1127 11:26:58.778753 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5449cc7d8-rpm6t" event={"ID":"e4fc9fe4-54f1-458b-b2f7-ff20982e3243","Type":"ContainerStarted","Data":"670a3920f9b69b9b1d1b9e9a109a2811d1cb6ae92433bed609f79a36fc07955e"} Nov 27 11:26:58 crc kubenswrapper[4807]: I1127 11:26:58.778862 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:26:58 crc kubenswrapper[4807]: I1127 11:26:58.807196 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.80717415 podStartE2EDuration="3.80717415s" podCreationTimestamp="2025-11-27 11:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:58.8030079 +0000 UTC m=+1059.902506098" watchObservedRunningTime="2025-11-27 11:26:58.80717415 +0000 UTC 
m=+1059.906672348" Nov 27 11:26:58 crc kubenswrapper[4807]: I1127 11:26:58.835479 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5449cc7d8-rpm6t" podStartSLOduration=2.8354626080000003 podStartE2EDuration="2.835462608s" podCreationTimestamp="2025-11-27 11:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:26:58.831626487 +0000 UTC m=+1059.931124685" watchObservedRunningTime="2025-11-27 11:26:58.835462608 +0000 UTC m=+1059.934960796" Nov 27 11:26:59 crc kubenswrapper[4807]: I1127 11:26:59.650700 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 27 11:26:59 crc kubenswrapper[4807]: I1127 11:26:59.651109 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 27 11:26:59 crc kubenswrapper[4807]: I1127 11:26:59.682799 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 27 11:26:59 crc kubenswrapper[4807]: I1127 11:26:59.695049 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 27 11:26:59 crc kubenswrapper[4807]: I1127 11:26:59.787046 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 27 11:26:59 crc kubenswrapper[4807]: I1127 11:26:59.787089 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.741112 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.795914 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lqqgw" Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.795910 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lqqgw" event={"ID":"3e03b856-d6ea-40a5-96db-10d788131661","Type":"ContainerDied","Data":"24f4c86444c92b9d5962451d7ad79d08083df68b872e46b390d7c970722ce1f6"} Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.795957 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24f4c86444c92b9d5962451d7ad79d08083df68b872e46b390d7c970722ce1f6" Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.807190 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.888336 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-prxjj"] Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.888617 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" podUID="d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" containerName="dnsmasq-dns" containerID="cri-o://7c17c739e4f56e868f315b50f5a1413747243af516557bfdf6fbadec1a185e9a" gracePeriod=10 Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.953786 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-combined-ca-bundle\") pod \"3e03b856-d6ea-40a5-96db-10d788131661\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.954140 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs9f6\" (UniqueName: \"kubernetes.io/projected/3e03b856-d6ea-40a5-96db-10d788131661-kube-api-access-rs9f6\") pod \"3e03b856-d6ea-40a5-96db-10d788131661\" (UID: 
\"3e03b856-d6ea-40a5-96db-10d788131661\") " Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.954193 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-credential-keys\") pod \"3e03b856-d6ea-40a5-96db-10d788131661\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.954262 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-scripts\") pod \"3e03b856-d6ea-40a5-96db-10d788131661\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.954327 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-config-data\") pod \"3e03b856-d6ea-40a5-96db-10d788131661\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.954363 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-fernet-keys\") pod \"3e03b856-d6ea-40a5-96db-10d788131661\" (UID: \"3e03b856-d6ea-40a5-96db-10d788131661\") " Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.963392 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3e03b856-d6ea-40a5-96db-10d788131661" (UID: "3e03b856-d6ea-40a5-96db-10d788131661"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.966114 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3e03b856-d6ea-40a5-96db-10d788131661" (UID: "3e03b856-d6ea-40a5-96db-10d788131661"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.978366 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-scripts" (OuterVolumeSpecName: "scripts") pod "3e03b856-d6ea-40a5-96db-10d788131661" (UID: "3e03b856-d6ea-40a5-96db-10d788131661"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.982586 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e03b856-d6ea-40a5-96db-10d788131661-kube-api-access-rs9f6" (OuterVolumeSpecName: "kube-api-access-rs9f6") pod "3e03b856-d6ea-40a5-96db-10d788131661" (UID: "3e03b856-d6ea-40a5-96db-10d788131661"). InnerVolumeSpecName "kube-api-access-rs9f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:00 crc kubenswrapper[4807]: I1127 11:27:00.990354 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-config-data" (OuterVolumeSpecName: "config-data") pod "3e03b856-d6ea-40a5-96db-10d788131661" (UID: "3e03b856-d6ea-40a5-96db-10d788131661"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.005787 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e03b856-d6ea-40a5-96db-10d788131661" (UID: "3e03b856-d6ea-40a5-96db-10d788131661"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.056095 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs9f6\" (UniqueName: \"kubernetes.io/projected/3e03b856-d6ea-40a5-96db-10d788131661-kube-api-access-rs9f6\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.056136 4807 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.056150 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.056161 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.056173 4807 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.056184 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3e03b856-d6ea-40a5-96db-10d788131661-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.812062 4807 generic.go:334] "Generic (PLEG): container finished" podID="d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" containerID="7c17c739e4f56e868f315b50f5a1413747243af516557bfdf6fbadec1a185e9a" exitCode=0 Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.812103 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" event={"ID":"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91","Type":"ContainerDied","Data":"7c17c739e4f56e868f315b50f5a1413747243af516557bfdf6fbadec1a185e9a"} Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.840031 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-54b9d76d5d-4mfvr"] Nov 27 11:27:01 crc kubenswrapper[4807]: E1127 11:27:01.840457 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e03b856-d6ea-40a5-96db-10d788131661" containerName="keystone-bootstrap" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.840478 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e03b856-d6ea-40a5-96db-10d788131661" containerName="keystone-bootstrap" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.840710 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e03b856-d6ea-40a5-96db-10d788131661" containerName="keystone-bootstrap" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.855480 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.855920 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-54b9d76d5d-4mfvr"] Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.858974 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-msqwq" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.859225 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.859375 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.859486 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.859770 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.859879 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.970686 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-scripts\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.970807 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-combined-ca-bundle\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " 
pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.970842 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-public-tls-certs\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.970912 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpw28\" (UniqueName: \"kubernetes.io/projected/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-kube-api-access-bpw28\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.970945 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-config-data\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.970974 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-fernet-keys\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.971009 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-internal-tls-certs\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: 
\"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:01 crc kubenswrapper[4807]: I1127 11:27:01.971031 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-credential-keys\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.072643 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-combined-ca-bundle\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.072690 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-public-tls-certs\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.072760 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpw28\" (UniqueName: \"kubernetes.io/projected/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-kube-api-access-bpw28\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.072791 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-config-data\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " 
pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.072820 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-fernet-keys\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.072860 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-internal-tls-certs\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.072884 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-credential-keys\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.072920 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-scripts\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.078078 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-scripts\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.078729 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-fernet-keys\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.079297 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-credential-keys\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.080073 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-public-tls-certs\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.080689 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-internal-tls-certs\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.081996 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-config-data\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.086458 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-combined-ca-bundle\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.090832 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpw28\" (UniqueName: \"kubernetes.io/projected/bc71ab7b-e861-46eb-ab9e-e45a4aafd76b-kube-api-access-bpw28\") pod \"keystone-54b9d76d5d-4mfvr\" (UID: \"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b\") " pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.235007 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.615836 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.615945 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 11:27:02 crc kubenswrapper[4807]: I1127 11:27:02.616275 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.145132 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.145189 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.147084 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b8bd6c76d-jg9hp" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: 
connect: connection refused" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.526031 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.526344 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.528426 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7d69cff6fb-88t5t" podUID="ba9d500c-ec74-4755-924d-8b6160bb51dc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.818475 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.843445 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" event={"ID":"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91","Type":"ContainerDied","Data":"019bd4fd56f0fa2dd0ad20baab4db2f34393f4a5e385a45c8d14326f47c091a6"} Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.843490 4807 scope.go:117] "RemoveContainer" containerID="7c17c739e4f56e868f315b50f5a1413747243af516557bfdf6fbadec1a185e9a" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.843626 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-prxjj" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.899504 4807 scope.go:117] "RemoveContainer" containerID="27182edd6fc96b73e26502ceee92b8a3adfb372535b97cf74ec4ed32682aa5d7" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.915967 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-ovsdbserver-nb\") pod \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.916019 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-config\") pod \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.916067 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-dns-svc\") pod \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.916137 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-ovsdbserver-sb\") pod \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.916183 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-dns-swift-storage-0\") pod \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " 
Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.916305 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z766f\" (UniqueName: \"kubernetes.io/projected/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-kube-api-access-z766f\") pod \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\" (UID: \"d97a4e6c-6a09-47cb-a9e5-f790da2ddb91\") " Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.925118 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-kube-api-access-z766f" (OuterVolumeSpecName: "kube-api-access-z766f") pod "d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" (UID: "d97a4e6c-6a09-47cb-a9e5-f790da2ddb91"). InnerVolumeSpecName "kube-api-access-z766f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.978034 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" (UID: "d97a4e6c-6a09-47cb-a9e5-f790da2ddb91"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.988439 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-config" (OuterVolumeSpecName: "config") pod "d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" (UID: "d97a4e6c-6a09-47cb-a9e5-f790da2ddb91"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:03 crc kubenswrapper[4807]: I1127 11:27:03.990737 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" (UID: "d97a4e6c-6a09-47cb-a9e5-f790da2ddb91"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.002369 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" (UID: "d97a4e6c-6a09-47cb-a9e5-f790da2ddb91"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.010693 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" (UID: "d97a4e6c-6a09-47cb-a9e5-f790da2ddb91"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.019381 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.019440 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.019451 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.019460 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.019471 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z766f\" (UniqueName: \"kubernetes.io/projected/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-kube-api-access-z766f\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.019479 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.161640 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-54b9d76d5d-4mfvr"] Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.202333 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-prxjj"] Nov 27 11:27:04 
crc kubenswrapper[4807]: I1127 11:27:04.217800 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-prxjj"] Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.870042 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-54b9d76d5d-4mfvr" event={"ID":"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b","Type":"ContainerStarted","Data":"3c5ba49266868798335fa1b7210ed5b4444957ddfe5db964160aca03821af566"} Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.870322 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-54b9d76d5d-4mfvr" event={"ID":"bc71ab7b-e861-46eb-ab9e-e45a4aafd76b","Type":"ContainerStarted","Data":"71d7c7c678a3b718ab4eb3978e6e87fd64f49757fe5c5659106dc798ffd66e48"} Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.871722 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.886770 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a3d55df-3e92-4cb5-aedd-7589b72d5471","Type":"ContainerStarted","Data":"22b95ded5404140ba492ace9c977f3416a57075d3165023f5c16927ca40d0255"} Nov 27 11:27:04 crc kubenswrapper[4807]: I1127 11:27:04.912717 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-54b9d76d5d-4mfvr" podStartSLOduration=3.912702923 podStartE2EDuration="3.912702923s" podCreationTimestamp="2025-11-27 11:27:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:27:04.90461637 +0000 UTC m=+1066.004114568" watchObservedRunningTime="2025-11-27 11:27:04.912702923 +0000 UTC m=+1066.012201121" Nov 27 11:27:05 crc kubenswrapper[4807]: I1127 11:27:05.547486 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" 
path="/var/lib/kubelet/pods/d97a4e6c-6a09-47cb-a9e5-f790da2ddb91/volumes" Nov 27 11:27:05 crc kubenswrapper[4807]: I1127 11:27:05.922658 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rbw47" event={"ID":"b2df3b54-f71f-469f-92e5-8c1daeb90a45","Type":"ContainerStarted","Data":"55e04a6c44d7553589965d4f48795e1b0b6153736bf978aec0361b403c94dee2"} Nov 27 11:27:06 crc kubenswrapper[4807]: I1127 11:27:06.012171 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 27 11:27:06 crc kubenswrapper[4807]: I1127 11:27:06.012281 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 27 11:27:06 crc kubenswrapper[4807]: I1127 11:27:06.047125 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 27 11:27:06 crc kubenswrapper[4807]: I1127 11:27:06.090518 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 27 11:27:06 crc kubenswrapper[4807]: I1127 11:27:06.932926 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-whdsr" event={"ID":"1b745997-2256-496c-acee-f804c263ec35","Type":"ContainerStarted","Data":"372c20533929fa3e737438722668f2ec05f738c37da43875aa72d9d3ec74a23b"} Nov 27 11:27:06 crc kubenswrapper[4807]: I1127 11:27:06.934541 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 27 11:27:06 crc kubenswrapper[4807]: I1127 11:27:06.934583 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 27 11:27:06 crc kubenswrapper[4807]: I1127 11:27:06.948834 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-rbw47" podStartSLOduration=4.112087605 
podStartE2EDuration="47.948816976s" podCreationTimestamp="2025-11-27 11:26:19 +0000 UTC" firstStartedPulling="2025-11-27 11:26:21.172774462 +0000 UTC m=+1022.272272660" lastFinishedPulling="2025-11-27 11:27:05.009503823 +0000 UTC m=+1066.109002031" observedRunningTime="2025-11-27 11:27:06.948326804 +0000 UTC m=+1068.047825002" watchObservedRunningTime="2025-11-27 11:27:06.948816976 +0000 UTC m=+1068.048315164" Nov 27 11:27:06 crc kubenswrapper[4807]: I1127 11:27:06.969259 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-whdsr" podStartSLOduration=3.901416566 podStartE2EDuration="47.969211186s" podCreationTimestamp="2025-11-27 11:26:19 +0000 UTC" firstStartedPulling="2025-11-27 11:26:21.191392644 +0000 UTC m=+1022.290890842" lastFinishedPulling="2025-11-27 11:27:05.259187264 +0000 UTC m=+1066.358685462" observedRunningTime="2025-11-27 11:27:06.96370346 +0000 UTC m=+1068.063201658" watchObservedRunningTime="2025-11-27 11:27:06.969211186 +0000 UTC m=+1068.068709384" Nov 27 11:27:09 crc kubenswrapper[4807]: I1127 11:27:09.155149 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 27 11:27:09 crc kubenswrapper[4807]: I1127 11:27:09.155512 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 11:27:09 crc kubenswrapper[4807]: I1127 11:27:09.160630 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 27 11:27:09 crc kubenswrapper[4807]: I1127 11:27:09.961225 4807 generic.go:334] "Generic (PLEG): container finished" podID="b2df3b54-f71f-469f-92e5-8c1daeb90a45" containerID="55e04a6c44d7553589965d4f48795e1b0b6153736bf978aec0361b403c94dee2" exitCode=0 Nov 27 11:27:09 crc kubenswrapper[4807]: I1127 11:27:09.961324 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rbw47" 
event={"ID":"b2df3b54-f71f-469f-92e5-8c1daeb90a45","Type":"ContainerDied","Data":"55e04a6c44d7553589965d4f48795e1b0b6153736bf978aec0361b403c94dee2"} Nov 27 11:27:11 crc kubenswrapper[4807]: I1127 11:27:11.984363 4807 generic.go:334] "Generic (PLEG): container finished" podID="1b745997-2256-496c-acee-f804c263ec35" containerID="372c20533929fa3e737438722668f2ec05f738c37da43875aa72d9d3ec74a23b" exitCode=0 Nov 27 11:27:11 crc kubenswrapper[4807]: I1127 11:27:11.984416 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-whdsr" event={"ID":"1b745997-2256-496c-acee-f804c263ec35","Type":"ContainerDied","Data":"372c20533929fa3e737438722668f2ec05f738c37da43875aa72d9d3ec74a23b"} Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.145501 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7b8bd6c76d-jg9hp" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.526994 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7d69cff6fb-88t5t" podUID="ba9d500c-ec74-4755-924d-8b6160bb51dc" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.568827 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rbw47" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.578300 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-whdsr" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.667108 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr77d\" (UniqueName: \"kubernetes.io/projected/b2df3b54-f71f-469f-92e5-8c1daeb90a45-kube-api-access-xr77d\") pod \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\" (UID: \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\") " Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.667174 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b2df3b54-f71f-469f-92e5-8c1daeb90a45-db-sync-config-data\") pod \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\" (UID: \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\") " Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.667316 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2df3b54-f71f-469f-92e5-8c1daeb90a45-combined-ca-bundle\") pod \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\" (UID: \"b2df3b54-f71f-469f-92e5-8c1daeb90a45\") " Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.671500 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2df3b54-f71f-469f-92e5-8c1daeb90a45-kube-api-access-xr77d" (OuterVolumeSpecName: "kube-api-access-xr77d") pod "b2df3b54-f71f-469f-92e5-8c1daeb90a45" (UID: "b2df3b54-f71f-469f-92e5-8c1daeb90a45"). InnerVolumeSpecName "kube-api-access-xr77d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.671778 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2df3b54-f71f-469f-92e5-8c1daeb90a45-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b2df3b54-f71f-469f-92e5-8c1daeb90a45" (UID: "b2df3b54-f71f-469f-92e5-8c1daeb90a45"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.693546 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2df3b54-f71f-469f-92e5-8c1daeb90a45-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2df3b54-f71f-469f-92e5-8c1daeb90a45" (UID: "b2df3b54-f71f-469f-92e5-8c1daeb90a45"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.769446 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-combined-ca-bundle\") pod \"1b745997-2256-496c-acee-f804c263ec35\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.769544 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-scripts\") pod \"1b745997-2256-496c-acee-f804c263ec35\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.769623 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-db-sync-config-data\") pod \"1b745997-2256-496c-acee-f804c263ec35\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.769642 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b745997-2256-496c-acee-f804c263ec35-etc-machine-id\") pod \"1b745997-2256-496c-acee-f804c263ec35\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " Nov 27 11:27:13 crc kubenswrapper[4807]: 
I1127 11:27:13.769678 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-config-data\") pod \"1b745997-2256-496c-acee-f804c263ec35\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.769697 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtldw\" (UniqueName: \"kubernetes.io/projected/1b745997-2256-496c-acee-f804c263ec35-kube-api-access-rtldw\") pod \"1b745997-2256-496c-acee-f804c263ec35\" (UID: \"1b745997-2256-496c-acee-f804c263ec35\") " Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.770063 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr77d\" (UniqueName: \"kubernetes.io/projected/b2df3b54-f71f-469f-92e5-8c1daeb90a45-kube-api-access-xr77d\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.770089 4807 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b2df3b54-f71f-469f-92e5-8c1daeb90a45-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.770118 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2df3b54-f71f-469f-92e5-8c1daeb90a45-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.770801 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1b745997-2256-496c-acee-f804c263ec35-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1b745997-2256-496c-acee-f804c263ec35" (UID: "1b745997-2256-496c-acee-f804c263ec35"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.774536 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b745997-2256-496c-acee-f804c263ec35-kube-api-access-rtldw" (OuterVolumeSpecName: "kube-api-access-rtldw") pod "1b745997-2256-496c-acee-f804c263ec35" (UID: "1b745997-2256-496c-acee-f804c263ec35"). InnerVolumeSpecName "kube-api-access-rtldw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.774590 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-scripts" (OuterVolumeSpecName: "scripts") pod "1b745997-2256-496c-acee-f804c263ec35" (UID: "1b745997-2256-496c-acee-f804c263ec35"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.774617 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1b745997-2256-496c-acee-f804c263ec35" (UID: "1b745997-2256-496c-acee-f804c263ec35"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.797283 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b745997-2256-496c-acee-f804c263ec35" (UID: "1b745997-2256-496c-acee-f804c263ec35"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.822662 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-config-data" (OuterVolumeSpecName: "config-data") pod "1b745997-2256-496c-acee-f804c263ec35" (UID: "1b745997-2256-496c-acee-f804c263ec35"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.872242 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.872303 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.872313 4807 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.872323 4807 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1b745997-2256-496c-acee-f804c263ec35-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.872333 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b745997-2256-496c-acee-f804c263ec35-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:13 crc kubenswrapper[4807]: I1127 11:27:13.872343 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtldw\" (UniqueName: 
\"kubernetes.io/projected/1b745997-2256-496c-acee-f804c263ec35-kube-api-access-rtldw\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.002013 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-rbw47" event={"ID":"b2df3b54-f71f-469f-92e5-8c1daeb90a45","Type":"ContainerDied","Data":"2576ebbcc36a31162bfcfbb375407818c273598a76d2ef1dec0d1d8da64ae3d7"} Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.002061 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2576ebbcc36a31162bfcfbb375407818c273598a76d2ef1dec0d1d8da64ae3d7" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.002026 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-rbw47" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.003836 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-whdsr" event={"ID":"1b745997-2256-496c-acee-f804c263ec35","Type":"ContainerDied","Data":"69892bdf7663ff68d6bcc69f52c55d993d33f69e36b9750ab29d86f61f3d7efc"} Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.003887 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69892bdf7663ff68d6bcc69f52c55d993d33f69e36b9750ab29d86f61f3d7efc" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.003855 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-whdsr" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.005946 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a3d55df-3e92-4cb5-aedd-7589b72d5471","Type":"ContainerStarted","Data":"b61bd02cde828614e4a42ffe807b90423086c50b0b6fc2c811188abd9d32bbdb"} Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.006088 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="ceilometer-central-agent" containerID="cri-o://5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c" gracePeriod=30 Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.006383 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.006672 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="proxy-httpd" containerID="cri-o://b61bd02cde828614e4a42ffe807b90423086c50b0b6fc2c811188abd9d32bbdb" gracePeriod=30 Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.006733 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="sg-core" containerID="cri-o://22b95ded5404140ba492ace9c977f3416a57075d3165023f5c16927ca40d0255" gracePeriod=30 Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.006773 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="ceilometer-notification-agent" containerID="cri-o://61a0a483cdc37e763bcba6a67e1ca79f1c1a4c94a9cb3d7a613306bf0b770075" gracePeriod=30 Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.046072 4807 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.663491557 podStartE2EDuration="55.04605308s" podCreationTimestamp="2025-11-27 11:26:19 +0000 UTC" firstStartedPulling="2025-11-27 11:26:21.193763417 +0000 UTC m=+1022.293261615" lastFinishedPulling="2025-11-27 11:27:13.57632494 +0000 UTC m=+1074.675823138" observedRunningTime="2025-11-27 11:27:14.04267217 +0000 UTC m=+1075.142170378" watchObservedRunningTime="2025-11-27 11:27:14.04605308 +0000 UTC m=+1075.145551278" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.307771 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 11:27:14 crc kubenswrapper[4807]: E1127 11:27:14.308156 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" containerName="dnsmasq-dns" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.308169 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" containerName="dnsmasq-dns" Nov 27 11:27:14 crc kubenswrapper[4807]: E1127 11:27:14.308207 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" containerName="init" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.308213 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" containerName="init" Nov 27 11:27:14 crc kubenswrapper[4807]: E1127 11:27:14.308222 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b745997-2256-496c-acee-f804c263ec35" containerName="cinder-db-sync" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.308229 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b745997-2256-496c-acee-f804c263ec35" containerName="cinder-db-sync" Nov 27 11:27:14 crc kubenswrapper[4807]: E1127 11:27:14.308273 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b2df3b54-f71f-469f-92e5-8c1daeb90a45" containerName="barbican-db-sync" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.308281 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2df3b54-f71f-469f-92e5-8c1daeb90a45" containerName="barbican-db-sync" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.308456 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b745997-2256-496c-acee-f804c263ec35" containerName="cinder-db-sync" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.308467 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2df3b54-f71f-469f-92e5-8c1daeb90a45" containerName="barbican-db-sync" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.308482 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d97a4e6c-6a09-47cb-a9e5-f790da2ddb91" containerName="dnsmasq-dns" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.309375 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.312829 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.313054 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rmb2j" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.313322 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.314548 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.326098 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.407147 4807 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-b895b5785-gj2cp"] Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.417179 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.418751 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-gj2cp"] Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.496272 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.496315 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.496333 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-config-data\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.496355 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc 
kubenswrapper[4807]: I1127 11:27:14.496377 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.496398 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-dns-svc\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.496412 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-scripts\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.496441 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvqxc\" (UniqueName: \"kubernetes.io/projected/2062d69b-506e-4da5-8152-de320573bf94-kube-api-access-lvqxc\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.496477 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg9kp\" (UniqueName: \"kubernetes.io/projected/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-kube-api-access-jg9kp\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 
crc kubenswrapper[4807]: I1127 11:27:14.496498 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.496567 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-config\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.496583 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2062d69b-506e-4da5-8152-de320573bf94-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.590861 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.598621 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.601363 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.601996 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-config\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.602029 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2062d69b-506e-4da5-8152-de320573bf94-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.602356 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2062d69b-506e-4da5-8152-de320573bf94-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.603147 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-config\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.603415 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.603464 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.603490 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-config-data\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.603521 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.603553 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.603583 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-dns-svc\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc 
kubenswrapper[4807]: I1127 11:27:14.603600 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-scripts\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.603647 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvqxc\" (UniqueName: \"kubernetes.io/projected/2062d69b-506e-4da5-8152-de320573bf94-kube-api-access-lvqxc\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.603744 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9kp\" (UniqueName: \"kubernetes.io/projected/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-kube-api-access-jg9kp\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.603790 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.604135 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.605593 4807 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-dns-svc\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.605734 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.606573 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.610465 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-scripts\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.610777 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-config-data\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.611726 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.612980 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.614501 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.632820 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg9kp\" (UniqueName: \"kubernetes.io/projected/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-kube-api-access-jg9kp\") pod \"dnsmasq-dns-b895b5785-gj2cp\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.641890 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvqxc\" (UniqueName: \"kubernetes.io/projected/2062d69b-506e-4da5-8152-de320573bf94-kube-api-access-lvqxc\") pod \"cinder-scheduler-0\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.704567 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-config-data\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.704636 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/390b4557-5aed-4b7a-ab55-6e68fd33fd54-etc-machine-id\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.704664 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/390b4557-5aed-4b7a-ab55-6e68fd33fd54-logs\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.704686 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.704721 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-scripts\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.704737 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9zkc\" (UniqueName: \"kubernetes.io/projected/390b4557-5aed-4b7a-ab55-6e68fd33fd54-kube-api-access-d9zkc\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.704752 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.774735 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:14 crc kubenswrapper[4807]: E1127 11:27:14.795037 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a3d55df_3e92_4cb5_aedd_7589b72d5471.slice/crio-conmon-5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c.scope\": RecentStats: unable to find data in memory cache]" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.810391 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-config-data\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.810482 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/390b4557-5aed-4b7a-ab55-6e68fd33fd54-etc-machine-id\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.810520 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/390b4557-5aed-4b7a-ab55-6e68fd33fd54-logs\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.810557 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-combined-ca-bundle\") pod 
\"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.810609 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-scripts\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.810632 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9zkc\" (UniqueName: \"kubernetes.io/projected/390b4557-5aed-4b7a-ab55-6e68fd33fd54-kube-api-access-d9zkc\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.810653 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-config-data-custom\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.835778 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/390b4557-5aed-4b7a-ab55-6e68fd33fd54-logs\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.839308 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/390b4557-5aed-4b7a-ab55-6e68fd33fd54-etc-machine-id\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.839821 4807 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-keystone-listener-67945bfd5d-wnmj5"] Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.841568 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.849218 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-config-data\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.854923 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-config-data-custom\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.855837 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.856164 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-scripts\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.856564 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-867f8" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.856790 4807 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"barbican-keystone-listener-config-data" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.856901 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.858538 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5df9b5c779-cqvbn"] Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.863412 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.881519 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.915465 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b22f0add-3876-4db6-a6ac-83bf95c37ea6-config-data-custom\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.915527 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b22f0add-3876-4db6-a6ac-83bf95c37ea6-config-data\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.915554 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b00dcc-1d4a-4d61-9865-db7b0515e360-config-data\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " 
pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.915594 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25b00dcc-1d4a-4d61-9865-db7b0515e360-config-data-custom\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.915613 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b22f0add-3876-4db6-a6ac-83bf95c37ea6-logs\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.915642 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b00dcc-1d4a-4d61-9865-db7b0515e360-combined-ca-bundle\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.915686 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsmws\" (UniqueName: \"kubernetes.io/projected/b22f0add-3876-4db6-a6ac-83bf95c37ea6-kube-api-access-bsmws\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.915725 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/25b00dcc-1d4a-4d61-9865-db7b0515e360-logs\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.915757 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22f0add-3876-4db6-a6ac-83bf95c37ea6-combined-ca-bundle\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.915783 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drw22\" (UniqueName: \"kubernetes.io/projected/25b00dcc-1d4a-4d61-9865-db7b0515e360-kube-api-access-drw22\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.921095 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9zkc\" (UniqueName: \"kubernetes.io/projected/390b4557-5aed-4b7a-ab55-6e68fd33fd54-kube-api-access-d9zkc\") pod \"cinder-api-0\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " pod="openstack/cinder-api-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.921417 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-67945bfd5d-wnmj5"] Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.927009 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.978820 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5df9b5c779-cqvbn"] Nov 27 11:27:14 crc kubenswrapper[4807]: I1127 11:27:14.987636 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.024800 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsmws\" (UniqueName: \"kubernetes.io/projected/b22f0add-3876-4db6-a6ac-83bf95c37ea6-kube-api-access-bsmws\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.024867 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25b00dcc-1d4a-4d61-9865-db7b0515e360-logs\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.024911 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22f0add-3876-4db6-a6ac-83bf95c37ea6-combined-ca-bundle\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.024944 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drw22\" (UniqueName: \"kubernetes.io/projected/25b00dcc-1d4a-4d61-9865-db7b0515e360-kube-api-access-drw22\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " 
pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.024974 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b22f0add-3876-4db6-a6ac-83bf95c37ea6-config-data-custom\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.025002 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b22f0add-3876-4db6-a6ac-83bf95c37ea6-config-data\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.025028 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b00dcc-1d4a-4d61-9865-db7b0515e360-config-data\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.025067 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25b00dcc-1d4a-4d61-9865-db7b0515e360-config-data-custom\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.025087 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b22f0add-3876-4db6-a6ac-83bf95c37ea6-logs\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: 
\"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.025118 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b00dcc-1d4a-4d61-9865-db7b0515e360-combined-ca-bundle\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.033805 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25b00dcc-1d4a-4d61-9865-db7b0515e360-logs\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.045605 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b22f0add-3876-4db6-a6ac-83bf95c37ea6-logs\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.052336 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b22f0add-3876-4db6-a6ac-83bf95c37ea6-config-data-custom\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.064091 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b00dcc-1d4a-4d61-9865-db7b0515e360-combined-ca-bundle\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: 
\"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.076411 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b00dcc-1d4a-4d61-9865-db7b0515e360-config-data\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.076877 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25b00dcc-1d4a-4d61-9865-db7b0515e360-config-data-custom\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.077755 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22f0add-3876-4db6-a6ac-83bf95c37ea6-combined-ca-bundle\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.079517 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b22f0add-3876-4db6-a6ac-83bf95c37ea6-config-data\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.088131 4807 generic.go:334] "Generic (PLEG): container finished" podID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerID="b61bd02cde828614e4a42ffe807b90423086c50b0b6fc2c811188abd9d32bbdb" exitCode=0 Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 
11:27:15.088171 4807 generic.go:334] "Generic (PLEG): container finished" podID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerID="22b95ded5404140ba492ace9c977f3416a57075d3165023f5c16927ca40d0255" exitCode=2 Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.088179 4807 generic.go:334] "Generic (PLEG): container finished" podID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerID="5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c" exitCode=0 Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.088197 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a3d55df-3e92-4cb5-aedd-7589b72d5471","Type":"ContainerDied","Data":"b61bd02cde828614e4a42ffe807b90423086c50b0b6fc2c811188abd9d32bbdb"} Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.088221 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a3d55df-3e92-4cb5-aedd-7589b72d5471","Type":"ContainerDied","Data":"22b95ded5404140ba492ace9c977f3416a57075d3165023f5c16927ca40d0255"} Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.088282 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a3d55df-3e92-4cb5-aedd-7589b72d5471","Type":"ContainerDied","Data":"5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c"} Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.099868 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsmws\" (UniqueName: \"kubernetes.io/projected/b22f0add-3876-4db6-a6ac-83bf95c37ea6-kube-api-access-bsmws\") pod \"barbican-worker-5df9b5c779-cqvbn\" (UID: \"b22f0add-3876-4db6-a6ac-83bf95c37ea6\") " pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.103625 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drw22\" (UniqueName: 
\"kubernetes.io/projected/25b00dcc-1d4a-4d61-9865-db7b0515e360-kube-api-access-drw22\") pod \"barbican-keystone-listener-67945bfd5d-wnmj5\" (UID: \"25b00dcc-1d4a-4d61-9865-db7b0515e360\") " pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.136802 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-gj2cp"] Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.183232 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gvl2d"] Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.184766 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.231199 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4ph8\" (UniqueName: \"kubernetes.io/projected/7af73ef1-bb7d-4575-bc4e-3cff7945f644-kube-api-access-g4ph8\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.242853 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-config\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.242939 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc 
kubenswrapper[4807]: I1127 11:27:15.242995 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.243065 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.243168 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.232573 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.258183 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gvl2d"] Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.282277 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5df9b5c779-cqvbn" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.301318 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7fb8574d4b-zwdrx"] Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.302903 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.311533 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.322413 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7fb8574d4b-zwdrx"] Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.344535 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-config-data-custom\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.344582 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-combined-ca-bundle\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.344619 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-config-data\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 
11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.344651 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4ph8\" (UniqueName: \"kubernetes.io/projected/7af73ef1-bb7d-4575-bc4e-3cff7945f644-kube-api-access-g4ph8\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.344685 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-config\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.344711 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.344738 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.344760 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f4nj\" (UniqueName: \"kubernetes.io/projected/ab376f86-cb64-4c79-a6ec-701e75c42a9f-kube-api-access-7f4nj\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: 
I1127 11:27:15.344787 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.344809 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab376f86-cb64-4c79-a6ec-701e75c42a9f-logs\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.344842 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.345723 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.346550 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-config\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.347079 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.348192 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.348847 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.364510 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4ph8\" (UniqueName: \"kubernetes.io/projected/7af73ef1-bb7d-4575-bc4e-3cff7945f644-kube-api-access-g4ph8\") pod \"dnsmasq-dns-5c9776ccc5-gvl2d\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.446153 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-config-data-custom\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.446208 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-combined-ca-bundle\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.446259 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-config-data\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.446329 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f4nj\" (UniqueName: \"kubernetes.io/projected/ab376f86-cb64-4c79-a6ec-701e75c42a9f-kube-api-access-7f4nj\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.446365 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab376f86-cb64-4c79-a6ec-701e75c42a9f-logs\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.446790 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab376f86-cb64-4c79-a6ec-701e75c42a9f-logs\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.455384 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-config-data\") pod 
\"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.456046 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-combined-ca-bundle\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.456212 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-config-data-custom\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.492959 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f4nj\" (UniqueName: \"kubernetes.io/projected/ab376f86-cb64-4c79-a6ec-701e75c42a9f-kube-api-access-7f4nj\") pod \"barbican-api-7fb8574d4b-zwdrx\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.525469 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.574774 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-gj2cp"] Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.626581 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.668496 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.684044 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 27 11:27:15 crc kubenswrapper[4807]: I1127 11:27:15.831291 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-67945bfd5d-wnmj5"] Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.007285 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5df9b5c779-cqvbn"] Nov 27 11:27:16 crc kubenswrapper[4807]: W1127 11:27:16.018765 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb22f0add_3876_4db6_a6ac_83bf95c37ea6.slice/crio-c8dc9567e6e54e685c3e685af3a8b52957d86a25d6885398f185bcfd2dd30afb WatchSource:0}: Error finding container c8dc9567e6e54e685c3e685af3a8b52957d86a25d6885398f185bcfd2dd30afb: Status 404 returned error can't find the container with id c8dc9567e6e54e685c3e685af3a8b52957d86a25d6885398f185bcfd2dd30afb Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.099988 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" event={"ID":"25b00dcc-1d4a-4d61-9865-db7b0515e360","Type":"ContainerStarted","Data":"39332fa94a95f463f1fdd541b80550891a9a3951c183617f3c8285b327d0b007"} Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.101312 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"390b4557-5aed-4b7a-ab55-6e68fd33fd54","Type":"ContainerStarted","Data":"9086913e4f9df184749bc12a29dbdec5fca73cdbe7cbd1fee0fbfe56c8552bae"} Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.108101 4807 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-worker-5df9b5c779-cqvbn" event={"ID":"b22f0add-3876-4db6-a6ac-83bf95c37ea6","Type":"ContainerStarted","Data":"c8dc9567e6e54e685c3e685af3a8b52957d86a25d6885398f185bcfd2dd30afb"} Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.112151 4807 generic.go:334] "Generic (PLEG): container finished" podID="c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6" containerID="0b755a223299fe5b053e4f8614fb91272f43c0049c02e6fc655959590e3aaf78" exitCode=0 Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.112477 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-gj2cp" event={"ID":"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6","Type":"ContainerDied","Data":"0b755a223299fe5b053e4f8614fb91272f43c0049c02e6fc655959590e3aaf78"} Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.112531 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-gj2cp" event={"ID":"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6","Type":"ContainerStarted","Data":"d0dd5fe0a3bc06d8efe2a9058f3461ea78a5900821138ef687c30b84ce44949e"} Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.115365 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2062d69b-506e-4da5-8152-de320573bf94","Type":"ContainerStarted","Data":"7674985ba05511335ef6b5ab8600ce67dbea98dba7ce080fd4121fc74d8546d1"} Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.148458 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gvl2d"] Nov 27 11:27:16 crc kubenswrapper[4807]: W1127 11:27:16.153024 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7af73ef1_bb7d_4575_bc4e_3cff7945f644.slice/crio-ff8157db1781b5fcb60d1f5a5f7f1701ed6498f25257866abe985b6e0a2f4b1f WatchSource:0}: Error finding container ff8157db1781b5fcb60d1f5a5f7f1701ed6498f25257866abe985b6e0a2f4b1f: Status 404 returned 
error can't find the container with id ff8157db1781b5fcb60d1f5a5f7f1701ed6498f25257866abe985b6e0a2f4b1f Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.239641 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7fb8574d4b-zwdrx"] Nov 27 11:27:16 crc kubenswrapper[4807]: W1127 11:27:16.250707 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab376f86_cb64_4c79_a6ec_701e75c42a9f.slice/crio-52330ff553d797c7d02bad9b0fb5c193117a98bbb73364d40e08445379bb9642 WatchSource:0}: Error finding container 52330ff553d797c7d02bad9b0fb5c193117a98bbb73364d40e08445379bb9642: Status 404 returned error can't find the container with id 52330ff553d797c7d02bad9b0fb5c193117a98bbb73364d40e08445379bb9642 Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.426035 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.462721 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-ovsdbserver-nb\") pod \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.462857 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-config\") pod \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.462910 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-dns-swift-storage-0\") pod 
\"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.462943 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-dns-svc\") pod \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.462988 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg9kp\" (UniqueName: \"kubernetes.io/projected/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-kube-api-access-jg9kp\") pod \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.463084 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-ovsdbserver-sb\") pod \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\" (UID: \"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6\") " Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.474789 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-kube-api-access-jg9kp" (OuterVolumeSpecName: "kube-api-access-jg9kp") pod "c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6" (UID: "c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6"). InnerVolumeSpecName "kube-api-access-jg9kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.493880 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6" (UID: "c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.497586 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6" (UID: "c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.498065 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-config" (OuterVolumeSpecName: "config") pod "c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6" (UID: "c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.507009 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6" (UID: "c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.513964 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6" (UID: "c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.565441 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.565470 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.565480 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.565491 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.565500 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:16 crc kubenswrapper[4807]: I1127 11:27:16.565509 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg9kp\" (UniqueName: \"kubernetes.io/projected/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6-kube-api-access-jg9kp\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.067115 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.076201 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-config-data\") pod \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.076421 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjf4f\" (UniqueName: \"kubernetes.io/projected/7a3d55df-3e92-4cb5-aedd-7589b72d5471-kube-api-access-rjf4f\") pod \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.076446 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3d55df-3e92-4cb5-aedd-7589b72d5471-run-httpd\") pod \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.076469 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-sg-core-conf-yaml\") pod \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.076530 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-scripts\") pod \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.076566 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-combined-ca-bundle\") pod \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.076618 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3d55df-3e92-4cb5-aedd-7589b72d5471-log-httpd\") pod \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\" (UID: \"7a3d55df-3e92-4cb5-aedd-7589b72d5471\") " Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.077858 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3d55df-3e92-4cb5-aedd-7589b72d5471-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7a3d55df-3e92-4cb5-aedd-7589b72d5471" (UID: "7a3d55df-3e92-4cb5-aedd-7589b72d5471"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.078650 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3d55df-3e92-4cb5-aedd-7589b72d5471-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7a3d55df-3e92-4cb5-aedd-7589b72d5471" (UID: "7a3d55df-3e92-4cb5-aedd-7589b72d5471"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.092031 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3d55df-3e92-4cb5-aedd-7589b72d5471-kube-api-access-rjf4f" (OuterVolumeSpecName: "kube-api-access-rjf4f") pod "7a3d55df-3e92-4cb5-aedd-7589b72d5471" (UID: "7a3d55df-3e92-4cb5-aedd-7589b72d5471"). InnerVolumeSpecName "kube-api-access-rjf4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.097412 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-scripts" (OuterVolumeSpecName: "scripts") pod "7a3d55df-3e92-4cb5-aedd-7589b72d5471" (UID: "7a3d55df-3e92-4cb5-aedd-7589b72d5471"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.148385 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7a3d55df-3e92-4cb5-aedd-7589b72d5471" (UID: "7a3d55df-3e92-4cb5-aedd-7589b72d5471"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.166596 4807 generic.go:334] "Generic (PLEG): container finished" podID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerID="61a0a483cdc37e763bcba6a67e1ca79f1c1a4c94a9cb3d7a613306bf0b770075" exitCode=0 Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.166674 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.166696 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a3d55df-3e92-4cb5-aedd-7589b72d5471","Type":"ContainerDied","Data":"61a0a483cdc37e763bcba6a67e1ca79f1c1a4c94a9cb3d7a613306bf0b770075"} Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.166755 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7a3d55df-3e92-4cb5-aedd-7589b72d5471","Type":"ContainerDied","Data":"b241dac74f182dd7b3b450c1efda750c4107f8a5618fbce34b6c1226c009fe94"} Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.166772 4807 scope.go:117] "RemoveContainer" containerID="b61bd02cde828614e4a42ffe807b90423086c50b0b6fc2c811188abd9d32bbdb" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.172914 4807 generic.go:334] "Generic (PLEG): container finished" podID="7af73ef1-bb7d-4575-bc4e-3cff7945f644" containerID="2bdb66c974c85074f4ff2073d5f0c95e1b6455e79f00132e7742f84b5d3a4525" exitCode=0 Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.172980 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" event={"ID":"7af73ef1-bb7d-4575-bc4e-3cff7945f644","Type":"ContainerDied","Data":"2bdb66c974c85074f4ff2073d5f0c95e1b6455e79f00132e7742f84b5d3a4525"} Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.173005 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" event={"ID":"7af73ef1-bb7d-4575-bc4e-3cff7945f644","Type":"ContainerStarted","Data":"ff8157db1781b5fcb60d1f5a5f7f1701ed6498f25257866abe985b6e0a2f4b1f"} Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.178692 4807 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:17 crc 
kubenswrapper[4807]: I1127 11:27:17.178718 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.178729 4807 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3d55df-3e92-4cb5-aedd-7589b72d5471-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.178738 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjf4f\" (UniqueName: \"kubernetes.io/projected/7a3d55df-3e92-4cb5-aedd-7589b72d5471-kube-api-access-rjf4f\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.178746 4807 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3d55df-3e92-4cb5-aedd-7589b72d5471-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.180223 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-gj2cp" event={"ID":"c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6","Type":"ContainerDied","Data":"d0dd5fe0a3bc06d8efe2a9058f3461ea78a5900821138ef687c30b84ce44949e"} Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.180416 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-gj2cp" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.183739 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.184650 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fb8574d4b-zwdrx" event={"ID":"ab376f86-cb64-4c79-a6ec-701e75c42a9f","Type":"ContainerStarted","Data":"48643af9b551b090c3d33f2d5ef22613ef6690d804f94b1c5d64008505b291e9"} Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.184674 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fb8574d4b-zwdrx" event={"ID":"ab376f86-cb64-4c79-a6ec-701e75c42a9f","Type":"ContainerStarted","Data":"132750a51fb2171b1d9d2429e2421a024cf4c4617bd219a0f513640f723e2131"} Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.184684 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fb8574d4b-zwdrx" event={"ID":"ab376f86-cb64-4c79-a6ec-701e75c42a9f","Type":"ContainerStarted","Data":"52330ff553d797c7d02bad9b0fb5c193117a98bbb73364d40e08445379bb9642"} Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.185120 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.185138 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.223451 4807 scope.go:117] "RemoveContainer" containerID="22b95ded5404140ba492ace9c977f3416a57075d3165023f5c16927ca40d0255" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.239714 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"390b4557-5aed-4b7a-ab55-6e68fd33fd54","Type":"ContainerStarted","Data":"88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb"} 
Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.289192 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a3d55df-3e92-4cb5-aedd-7589b72d5471" (UID: "7a3d55df-3e92-4cb5-aedd-7589b72d5471"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.323792 4807 scope.go:117] "RemoveContainer" containerID="61a0a483cdc37e763bcba6a67e1ca79f1c1a4c94a9cb3d7a613306bf0b770075" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.340896 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7fb8574d4b-zwdrx" podStartSLOduration=2.340875681 podStartE2EDuration="2.340875681s" podCreationTimestamp="2025-11-27 11:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:27:17.31247194 +0000 UTC m=+1078.411970138" watchObservedRunningTime="2025-11-27 11:27:17.340875681 +0000 UTC m=+1078.440373879" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.395435 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-config-data" (OuterVolumeSpecName: "config-data") pod "7a3d55df-3e92-4cb5-aedd-7589b72d5471" (UID: "7a3d55df-3e92-4cb5-aedd-7589b72d5471"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.396570 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.396590 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3d55df-3e92-4cb5-aedd-7589b72d5471-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.419730 4807 scope.go:117] "RemoveContainer" containerID="5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.429152 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-gj2cp"] Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.450034 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-gj2cp"] Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.531739 4807 scope.go:117] "RemoveContainer" containerID="b61bd02cde828614e4a42ffe807b90423086c50b0b6fc2c811188abd9d32bbdb" Nov 27 11:27:17 crc kubenswrapper[4807]: E1127 11:27:17.532227 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61bd02cde828614e4a42ffe807b90423086c50b0b6fc2c811188abd9d32bbdb\": container with ID starting with b61bd02cde828614e4a42ffe807b90423086c50b0b6fc2c811188abd9d32bbdb not found: ID does not exist" containerID="b61bd02cde828614e4a42ffe807b90423086c50b0b6fc2c811188abd9d32bbdb" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.532272 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61bd02cde828614e4a42ffe807b90423086c50b0b6fc2c811188abd9d32bbdb"} err="failed to get container 
status \"b61bd02cde828614e4a42ffe807b90423086c50b0b6fc2c811188abd9d32bbdb\": rpc error: code = NotFound desc = could not find container \"b61bd02cde828614e4a42ffe807b90423086c50b0b6fc2c811188abd9d32bbdb\": container with ID starting with b61bd02cde828614e4a42ffe807b90423086c50b0b6fc2c811188abd9d32bbdb not found: ID does not exist" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.532293 4807 scope.go:117] "RemoveContainer" containerID="22b95ded5404140ba492ace9c977f3416a57075d3165023f5c16927ca40d0255" Nov 27 11:27:17 crc kubenswrapper[4807]: E1127 11:27:17.532455 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22b95ded5404140ba492ace9c977f3416a57075d3165023f5c16927ca40d0255\": container with ID starting with 22b95ded5404140ba492ace9c977f3416a57075d3165023f5c16927ca40d0255 not found: ID does not exist" containerID="22b95ded5404140ba492ace9c977f3416a57075d3165023f5c16927ca40d0255" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.532473 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b95ded5404140ba492ace9c977f3416a57075d3165023f5c16927ca40d0255"} err="failed to get container status \"22b95ded5404140ba492ace9c977f3416a57075d3165023f5c16927ca40d0255\": rpc error: code = NotFound desc = could not find container \"22b95ded5404140ba492ace9c977f3416a57075d3165023f5c16927ca40d0255\": container with ID starting with 22b95ded5404140ba492ace9c977f3416a57075d3165023f5c16927ca40d0255 not found: ID does not exist" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.532488 4807 scope.go:117] "RemoveContainer" containerID="61a0a483cdc37e763bcba6a67e1ca79f1c1a4c94a9cb3d7a613306bf0b770075" Nov 27 11:27:17 crc kubenswrapper[4807]: E1127 11:27:17.532777 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"61a0a483cdc37e763bcba6a67e1ca79f1c1a4c94a9cb3d7a613306bf0b770075\": container with ID starting with 61a0a483cdc37e763bcba6a67e1ca79f1c1a4c94a9cb3d7a613306bf0b770075 not found: ID does not exist" containerID="61a0a483cdc37e763bcba6a67e1ca79f1c1a4c94a9cb3d7a613306bf0b770075" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.532800 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61a0a483cdc37e763bcba6a67e1ca79f1c1a4c94a9cb3d7a613306bf0b770075"} err="failed to get container status \"61a0a483cdc37e763bcba6a67e1ca79f1c1a4c94a9cb3d7a613306bf0b770075\": rpc error: code = NotFound desc = could not find container \"61a0a483cdc37e763bcba6a67e1ca79f1c1a4c94a9cb3d7a613306bf0b770075\": container with ID starting with 61a0a483cdc37e763bcba6a67e1ca79f1c1a4c94a9cb3d7a613306bf0b770075 not found: ID does not exist" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.532814 4807 scope.go:117] "RemoveContainer" containerID="5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c" Nov 27 11:27:17 crc kubenswrapper[4807]: E1127 11:27:17.533232 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c\": container with ID starting with 5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c not found: ID does not exist" containerID="5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.533269 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c"} err="failed to get container status \"5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c\": rpc error: code = NotFound desc = could not find container \"5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c\": container with ID 
starting with 5d3da65c9b18a5f3bf257f966bafc7dd750b6a7b931e3d6cf672dea32a661b2c not found: ID does not exist" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.533284 4807 scope.go:117] "RemoveContainer" containerID="0b755a223299fe5b053e4f8614fb91272f43c0049c02e6fc655959590e3aaf78" Nov 27 11:27:17 crc kubenswrapper[4807]: I1127 11:27:17.558118 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6" path="/var/lib/kubelet/pods/c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6/volumes" Nov 27 11:27:18 crc kubenswrapper[4807]: I1127 11:27:18.252076 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" event={"ID":"7af73ef1-bb7d-4575-bc4e-3cff7945f644","Type":"ContainerStarted","Data":"30e532528e1d6da8394614fb252c223a54036e145f64e638e053c9ce706ce0b5"} Nov 27 11:27:18 crc kubenswrapper[4807]: I1127 11:27:18.252668 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:18 crc kubenswrapper[4807]: I1127 11:27:18.254842 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2062d69b-506e-4da5-8152-de320573bf94","Type":"ContainerStarted","Data":"9de6fcabf16c08c0b23c0d40829d98150f251b5f5d5f7d2acfb93cea528387da"} Nov 27 11:27:18 crc kubenswrapper[4807]: I1127 11:27:18.256653 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"390b4557-5aed-4b7a-ab55-6e68fd33fd54","Type":"ContainerStarted","Data":"6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1"} Nov 27 11:27:18 crc kubenswrapper[4807]: I1127 11:27:18.256857 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="390b4557-5aed-4b7a-ab55-6e68fd33fd54" containerName="cinder-api-log" containerID="cri-o://88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb" gracePeriod=30 Nov 27 11:27:18 crc 
kubenswrapper[4807]: I1127 11:27:18.256952 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="390b4557-5aed-4b7a-ab55-6e68fd33fd54" containerName="cinder-api" containerID="cri-o://6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1" gracePeriod=30 Nov 27 11:27:18 crc kubenswrapper[4807]: I1127 11:27:18.295155 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.295130351 podStartE2EDuration="4.295130351s" podCreationTimestamp="2025-11-27 11:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:27:18.291346621 +0000 UTC m=+1079.390844819" watchObservedRunningTime="2025-11-27 11:27:18.295130351 +0000 UTC m=+1079.394628569" Nov 27 11:27:18 crc kubenswrapper[4807]: I1127 11:27:18.295813 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" podStartSLOduration=4.295802218 podStartE2EDuration="4.295802218s" podCreationTimestamp="2025-11-27 11:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:27:18.274528126 +0000 UTC m=+1079.374026324" watchObservedRunningTime="2025-11-27 11:27:18.295802218 +0000 UTC m=+1079.395300446" Nov 27 11:27:18 crc kubenswrapper[4807]: I1127 11:27:18.914608 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.041403 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-config-data\") pod \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.041726 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/390b4557-5aed-4b7a-ab55-6e68fd33fd54-etc-machine-id\") pod \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.041764 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-config-data-custom\") pod \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.041784 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/390b4557-5aed-4b7a-ab55-6e68fd33fd54-logs\") pod \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.041849 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-combined-ca-bundle\") pod \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.041903 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9zkc\" (UniqueName: 
\"kubernetes.io/projected/390b4557-5aed-4b7a-ab55-6e68fd33fd54-kube-api-access-d9zkc\") pod \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.041997 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-scripts\") pod \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\" (UID: \"390b4557-5aed-4b7a-ab55-6e68fd33fd54\") " Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.042213 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/390b4557-5aed-4b7a-ab55-6e68fd33fd54-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "390b4557-5aed-4b7a-ab55-6e68fd33fd54" (UID: "390b4557-5aed-4b7a-ab55-6e68fd33fd54"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.042470 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/390b4557-5aed-4b7a-ab55-6e68fd33fd54-logs" (OuterVolumeSpecName: "logs") pod "390b4557-5aed-4b7a-ab55-6e68fd33fd54" (UID: "390b4557-5aed-4b7a-ab55-6e68fd33fd54"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.043609 4807 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/390b4557-5aed-4b7a-ab55-6e68fd33fd54-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.043630 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/390b4557-5aed-4b7a-ab55-6e68fd33fd54-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.045547 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-scripts" (OuterVolumeSpecName: "scripts") pod "390b4557-5aed-4b7a-ab55-6e68fd33fd54" (UID: "390b4557-5aed-4b7a-ab55-6e68fd33fd54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.046135 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/390b4557-5aed-4b7a-ab55-6e68fd33fd54-kube-api-access-d9zkc" (OuterVolumeSpecName: "kube-api-access-d9zkc") pod "390b4557-5aed-4b7a-ab55-6e68fd33fd54" (UID: "390b4557-5aed-4b7a-ab55-6e68fd33fd54"). InnerVolumeSpecName "kube-api-access-d9zkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.046625 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "390b4557-5aed-4b7a-ab55-6e68fd33fd54" (UID: "390b4557-5aed-4b7a-ab55-6e68fd33fd54"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.070372 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "390b4557-5aed-4b7a-ab55-6e68fd33fd54" (UID: "390b4557-5aed-4b7a-ab55-6e68fd33fd54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.101939 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-config-data" (OuterVolumeSpecName: "config-data") pod "390b4557-5aed-4b7a-ab55-6e68fd33fd54" (UID: "390b4557-5aed-4b7a-ab55-6e68fd33fd54"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.147638 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.147675 4807 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.147691 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.147704 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9zkc\" (UniqueName: \"kubernetes.io/projected/390b4557-5aed-4b7a-ab55-6e68fd33fd54-kube-api-access-d9zkc\") on node \"crc\" 
DevicePath \"\"" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.147716 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/390b4557-5aed-4b7a-ab55-6e68fd33fd54-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.267654 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2062d69b-506e-4da5-8152-de320573bf94","Type":"ContainerStarted","Data":"216940f38cd8ef8f8a5e3c7d183950a45a1fc9447110acb1eb26454e2df97482"} Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.271553 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" event={"ID":"25b00dcc-1d4a-4d61-9865-db7b0515e360","Type":"ContainerStarted","Data":"476ad50f7929c1a99e19fab324ca8851c25e198cef1e792242806180b64564b8"} Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.271619 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" event={"ID":"25b00dcc-1d4a-4d61-9865-db7b0515e360","Type":"ContainerStarted","Data":"57c93360762b5fbec23f7805ff34fa11a7eef01eff3b27d5c95df2d2c71c0bec"} Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.275314 4807 generic.go:334] "Generic (PLEG): container finished" podID="390b4557-5aed-4b7a-ab55-6e68fd33fd54" containerID="6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1" exitCode=0 Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.275383 4807 generic.go:334] "Generic (PLEG): container finished" podID="390b4557-5aed-4b7a-ab55-6e68fd33fd54" containerID="88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb" exitCode=143 Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.275438 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"390b4557-5aed-4b7a-ab55-6e68fd33fd54","Type":"ContainerDied","Data":"6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1"} Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.275471 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"390b4557-5aed-4b7a-ab55-6e68fd33fd54","Type":"ContainerDied","Data":"88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb"} Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.275482 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"390b4557-5aed-4b7a-ab55-6e68fd33fd54","Type":"ContainerDied","Data":"9086913e4f9df184749bc12a29dbdec5fca73cdbe7cbd1fee0fbfe56c8552bae"} Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.275498 4807 scope.go:117] "RemoveContainer" containerID="6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.275781 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.282959 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5df9b5c779-cqvbn" event={"ID":"b22f0add-3876-4db6-a6ac-83bf95c37ea6","Type":"ContainerStarted","Data":"410b227a0e90b6583de3163f8a4c790f27872e6e568280ae34e4861324258ac3"} Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.283007 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5df9b5c779-cqvbn" event={"ID":"b22f0add-3876-4db6-a6ac-83bf95c37ea6","Type":"ContainerStarted","Data":"7366f646f68f9401e32cdad09eea9abb354904057aaafefbe3c6a222aff469cb"} Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.297022 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.9681545639999998 podStartE2EDuration="5.296998148s" podCreationTimestamp="2025-11-27 11:27:14 +0000 UTC" firstStartedPulling="2025-11-27 11:27:15.704527087 +0000 UTC m=+1076.804025285" lastFinishedPulling="2025-11-27 11:27:17.033370671 +0000 UTC m=+1078.132868869" observedRunningTime="2025-11-27 11:27:19.291978145 +0000 UTC m=+1080.391476353" watchObservedRunningTime="2025-11-27 11:27:19.296998148 +0000 UTC m=+1080.396496386" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.324505 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5df9b5c779-cqvbn" podStartSLOduration=2.988399671 podStartE2EDuration="5.324486985s" podCreationTimestamp="2025-11-27 11:27:14 +0000 UTC" firstStartedPulling="2025-11-27 11:27:16.023399058 +0000 UTC m=+1077.122897256" lastFinishedPulling="2025-11-27 11:27:18.359486372 +0000 UTC m=+1079.458984570" observedRunningTime="2025-11-27 11:27:19.317968522 +0000 UTC m=+1080.417466740" watchObservedRunningTime="2025-11-27 11:27:19.324486985 +0000 UTC m=+1080.423985183" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 
11:27:19.349462 4807 scope.go:117] "RemoveContainer" containerID="88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.366700 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-67945bfd5d-wnmj5" podStartSLOduration=2.890026499 podStartE2EDuration="5.3666776s" podCreationTimestamp="2025-11-27 11:27:14 +0000 UTC" firstStartedPulling="2025-11-27 11:27:15.88770624 +0000 UTC m=+1076.987204438" lastFinishedPulling="2025-11-27 11:27:18.364357341 +0000 UTC m=+1079.463855539" observedRunningTime="2025-11-27 11:27:19.347598016 +0000 UTC m=+1080.447096244" watchObservedRunningTime="2025-11-27 11:27:19.3666776 +0000 UTC m=+1080.466175808" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.400090 4807 scope.go:117] "RemoveContainer" containerID="6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1" Nov 27 11:27:19 crc kubenswrapper[4807]: E1127 11:27:19.400614 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1\": container with ID starting with 6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1 not found: ID does not exist" containerID="6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.400656 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1"} err="failed to get container status \"6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1\": rpc error: code = NotFound desc = could not find container \"6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1\": container with ID starting with 6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1 not found: ID does not 
exist" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.400691 4807 scope.go:117] "RemoveContainer" containerID="88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb" Nov 27 11:27:19 crc kubenswrapper[4807]: E1127 11:27:19.401207 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb\": container with ID starting with 88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb not found: ID does not exist" containerID="88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.401240 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb"} err="failed to get container status \"88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb\": rpc error: code = NotFound desc = could not find container \"88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb\": container with ID starting with 88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb not found: ID does not exist" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.401288 4807 scope.go:117] "RemoveContainer" containerID="6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.407311 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.407506 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1"} err="failed to get container status \"6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1\": rpc error: code = NotFound desc = could not find container 
\"6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1\": container with ID starting with 6081e336b6587df62f52f694502cb706f6164b79153397e0371b49f0c035b5d1 not found: ID does not exist" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.407550 4807 scope.go:117] "RemoveContainer" containerID="88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.414165 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb"} err="failed to get container status \"88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb\": rpc error: code = NotFound desc = could not find container \"88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb\": container with ID starting with 88882a425b6d9614ad74b19d19d0002c830fc16cb8fed5589ff8ec818c5ee2bb not found: ID does not exist" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.429394 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.437206 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 27 11:27:19 crc kubenswrapper[4807]: E1127 11:27:19.437680 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390b4557-5aed-4b7a-ab55-6e68fd33fd54" containerName="cinder-api-log" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.437703 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="390b4557-5aed-4b7a-ab55-6e68fd33fd54" containerName="cinder-api-log" Nov 27 11:27:19 crc kubenswrapper[4807]: E1127 11:27:19.437720 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390b4557-5aed-4b7a-ab55-6e68fd33fd54" containerName="cinder-api" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.437728 4807 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="390b4557-5aed-4b7a-ab55-6e68fd33fd54" containerName="cinder-api" Nov 27 11:27:19 crc kubenswrapper[4807]: E1127 11:27:19.437745 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="sg-core" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.437753 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="sg-core" Nov 27 11:27:19 crc kubenswrapper[4807]: E1127 11:27:19.437770 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="proxy-httpd" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.437778 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="proxy-httpd" Nov 27 11:27:19 crc kubenswrapper[4807]: E1127 11:27:19.437788 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="ceilometer-central-agent" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.437796 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="ceilometer-central-agent" Nov 27 11:27:19 crc kubenswrapper[4807]: E1127 11:27:19.437815 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6" containerName="init" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.437822 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6" containerName="init" Nov 27 11:27:19 crc kubenswrapper[4807]: E1127 11:27:19.437841 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="ceilometer-notification-agent" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.437849 4807 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="ceilometer-notification-agent" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.438047 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="ceilometer-notification-agent" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.438068 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="390b4557-5aed-4b7a-ab55-6e68fd33fd54" containerName="cinder-api-log" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.438084 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ae0f5f-fe9d-43c4-b0ca-0c6cc891f5d6" containerName="init" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.438098 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="390b4557-5aed-4b7a-ab55-6e68fd33fd54" containerName="cinder-api" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.438108 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="ceilometer-central-agent" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.438122 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="proxy-httpd" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.438138 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" containerName="sg-core" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.439322 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.440964 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.443809 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.443996 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.447677 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.545279 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="390b4557-5aed-4b7a-ab55-6e68fd33fd54" path="/var/lib/kubelet/pods/390b4557-5aed-4b7a-ab55-6e68fd33fd54/volumes" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.555144 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.555186 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/722777ce-cfa6-4b7d-96ba-452a6998356d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.555214 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-config-data\") pod \"cinder-api-0\" 
(UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.555354 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-config-data-custom\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.555455 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722777ce-cfa6-4b7d-96ba-452a6998356d-logs\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.555618 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.555697 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.555767 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmn6r\" (UniqueName: \"kubernetes.io/projected/722777ce-cfa6-4b7d-96ba-452a6998356d-kube-api-access-cmn6r\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc 
kubenswrapper[4807]: I1127 11:27:19.555794 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-scripts\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.657367 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.657449 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.657503 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmn6r\" (UniqueName: \"kubernetes.io/projected/722777ce-cfa6-4b7d-96ba-452a6998356d-kube-api-access-cmn6r\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.657531 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-scripts\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.657590 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.657611 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/722777ce-cfa6-4b7d-96ba-452a6998356d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.657633 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-config-data\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.657656 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-config-data-custom\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.657686 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722777ce-cfa6-4b7d-96ba-452a6998356d-logs\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.658181 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722777ce-cfa6-4b7d-96ba-452a6998356d-logs\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.659129 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/722777ce-cfa6-4b7d-96ba-452a6998356d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.664194 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.664400 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.675557 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-config-data\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.676021 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.676432 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-scripts\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.676548 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.676630 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.679819 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmn6r\" (UniqueName: \"kubernetes.io/projected/722777ce-cfa6-4b7d-96ba-452a6998356d-kube-api-access-cmn6r\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.680068 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-config-data-custom\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.689286 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/722777ce-cfa6-4b7d-96ba-452a6998356d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"722777ce-cfa6-4b7d-96ba-452a6998356d\") " pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.771520 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 27 11:27:19 crc kubenswrapper[4807]: I1127 11:27:19.933323 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 27 11:27:20 crc kubenswrapper[4807]: W1127 11:27:20.303934 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod722777ce_cfa6_4b7d_96ba_452a6998356d.slice/crio-93de489035d5679198a985713d20ddfb3a1d45709d62b0a86e306447d782d05c WatchSource:0}: Error finding container 93de489035d5679198a985713d20ddfb3a1d45709d62b0a86e306447d782d05c: Status 404 returned error can't find the container with id 93de489035d5679198a985713d20ddfb3a1d45709d62b0a86e306447d782d05c Nov 27 11:27:20 crc kubenswrapper[4807]: I1127 11:27:20.310139 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 27 11:27:20 crc kubenswrapper[4807]: I1127 11:27:20.925400 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.282240 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7b4c869746-crx9p"] Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.285079 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.288133 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.288666 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.301625 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b4c869746-crx9p"] Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.310960 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"722777ce-cfa6-4b7d-96ba-452a6998356d","Type":"ContainerStarted","Data":"f94db070b7f45f3e0bcdb1012da6563ea37b3ac79c363f5246a59930e992bd68"} Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.311017 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"722777ce-cfa6-4b7d-96ba-452a6998356d","Type":"ContainerStarted","Data":"93de489035d5679198a985713d20ddfb3a1d45709d62b0a86e306447d782d05c"} Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.390308 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-public-tls-certs\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.390370 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-logs\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc 
kubenswrapper[4807]: I1127 11:27:21.390420 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-combined-ca-bundle\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.390449 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd7hz\" (UniqueName: \"kubernetes.io/projected/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-kube-api-access-nd7hz\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.391513 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-internal-tls-certs\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.391585 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-config-data-custom\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.391790 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-config-data\") pod \"barbican-api-7b4c869746-crx9p\" (UID: 
\"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.495069 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-logs\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.495169 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-combined-ca-bundle\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.495209 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd7hz\" (UniqueName: \"kubernetes.io/projected/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-kube-api-access-nd7hz\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.495262 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-internal-tls-certs\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.495283 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-config-data-custom\") pod \"barbican-api-7b4c869746-crx9p\" (UID: 
\"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.496237 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-logs\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.497385 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-config-data\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.497531 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-public-tls-certs\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.500587 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-internal-tls-certs\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.505432 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-config-data-custom\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " 
pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.511861 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-config-data\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.512190 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd7hz\" (UniqueName: \"kubernetes.io/projected/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-kube-api-access-nd7hz\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.519462 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-public-tls-certs\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.522899 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ebef31-c3b4-4d86-96b1-92bb2038fcc2-combined-ca-bundle\") pod \"barbican-api-7b4c869746-crx9p\" (UID: \"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2\") " pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:21 crc kubenswrapper[4807]: I1127 11:27:21.604893 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:22 crc kubenswrapper[4807]: I1127 11:27:22.055967 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7b4c869746-crx9p"] Nov 27 11:27:22 crc kubenswrapper[4807]: I1127 11:27:22.339436 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"722777ce-cfa6-4b7d-96ba-452a6998356d","Type":"ContainerStarted","Data":"1baececeb64c05032dc42560f5d25b471dc66c843b5dbc6b9b76e1ccd4188820"} Nov 27 11:27:22 crc kubenswrapper[4807]: I1127 11:27:22.342590 4807 generic.go:334] "Generic (PLEG): container finished" podID="a0ccd0fd-c32d-40c1-b281-35b2b322faf7" containerID="e5cf7ce9b0cdb612da9ff66a1d8fbc4233e4c7ae33c0216e6a817a6e3a7c29cc" exitCode=137 Nov 27 11:27:22 crc kubenswrapper[4807]: I1127 11:27:22.342624 4807 generic.go:334] "Generic (PLEG): container finished" podID="a0ccd0fd-c32d-40c1-b281-35b2b322faf7" containerID="3f8babf2fcc7df3c6ede7d3b902b0240fcef73912c000b5339a196262faab499" exitCode=137 Nov 27 11:27:22 crc kubenswrapper[4807]: I1127 11:27:22.342668 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84ddb7cbfc-7lt2m" event={"ID":"a0ccd0fd-c32d-40c1-b281-35b2b322faf7","Type":"ContainerDied","Data":"e5cf7ce9b0cdb612da9ff66a1d8fbc4233e4c7ae33c0216e6a817a6e3a7c29cc"} Nov 27 11:27:22 crc kubenswrapper[4807]: I1127 11:27:22.342714 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84ddb7cbfc-7lt2m" event={"ID":"a0ccd0fd-c32d-40c1-b281-35b2b322faf7","Type":"ContainerDied","Data":"3f8babf2fcc7df3c6ede7d3b902b0240fcef73912c000b5339a196262faab499"} Nov 27 11:27:22 crc kubenswrapper[4807]: I1127 11:27:22.343702 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b4c869746-crx9p" event={"ID":"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2","Type":"ContainerStarted","Data":"46ee89a87b41da208508366a7b5c43ceb5f0848a34c66ea3949337d7473dcc6d"} Nov 27 11:27:23 
crc kubenswrapper[4807]: I1127 11:27:23.032148 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.140634 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-config-data\") pod \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.141204 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjs2l\" (UniqueName: \"kubernetes.io/projected/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-kube-api-access-mjs2l\") pod \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.141619 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-logs\") pod \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.142400 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-scripts\") pod \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.142568 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-horizon-secret-key\") pod \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\" (UID: \"a0ccd0fd-c32d-40c1-b281-35b2b322faf7\") " Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.143171 4807 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-logs" (OuterVolumeSpecName: "logs") pod "a0ccd0fd-c32d-40c1-b281-35b2b322faf7" (UID: "a0ccd0fd-c32d-40c1-b281-35b2b322faf7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.143515 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.157881 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a0ccd0fd-c32d-40c1-b281-35b2b322faf7" (UID: "a0ccd0fd-c32d-40c1-b281-35b2b322faf7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.167453 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-kube-api-access-mjs2l" (OuterVolumeSpecName: "kube-api-access-mjs2l") pod "a0ccd0fd-c32d-40c1-b281-35b2b322faf7" (UID: "a0ccd0fd-c32d-40c1-b281-35b2b322faf7"). InnerVolumeSpecName "kube-api-access-mjs2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.186177 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-config-data" (OuterVolumeSpecName: "config-data") pod "a0ccd0fd-c32d-40c1-b281-35b2b322faf7" (UID: "a0ccd0fd-c32d-40c1-b281-35b2b322faf7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.188782 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-scripts" (OuterVolumeSpecName: "scripts") pod "a0ccd0fd-c32d-40c1-b281-35b2b322faf7" (UID: "a0ccd0fd-c32d-40c1-b281-35b2b322faf7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.245842 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.245887 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjs2l\" (UniqueName: \"kubernetes.io/projected/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-kube-api-access-mjs2l\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.245900 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.245909 4807 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a0ccd0fd-c32d-40c1-b281-35b2b322faf7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.352140 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7b4c869746-crx9p" event={"ID":"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2","Type":"ContainerStarted","Data":"3f010c32eca920a3d0cf5640459791057f18b2f31a7532a4a100d3f0fe63a836"} Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.352176 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-7b4c869746-crx9p" event={"ID":"b3ebef31-c3b4-4d86-96b1-92bb2038fcc2","Type":"ContainerStarted","Data":"094d81445d172d507c70fd80962146908a6bfa83499fe1df8f7a2ceb5b9df4ef"} Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.353080 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.353105 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.354821 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84ddb7cbfc-7lt2m" event={"ID":"a0ccd0fd-c32d-40c1-b281-35b2b322faf7","Type":"ContainerDied","Data":"732ddeea9dd1edc994c275c3541acd981660efcabeae12cc0e59a92242909ebf"} Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.354855 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84ddb7cbfc-7lt2m" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.354854 4807 scope.go:117] "RemoveContainer" containerID="e5cf7ce9b0cdb612da9ff66a1d8fbc4233e4c7ae33c0216e6a817a6e3a7c29cc" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.355013 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.388542 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7b4c869746-crx9p" podStartSLOduration=2.388521183 podStartE2EDuration="2.388521183s" podCreationTimestamp="2025-11-27 11:27:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:27:23.375447608 +0000 UTC m=+1084.474945816" watchObservedRunningTime="2025-11-27 11:27:23.388521183 +0000 UTC m=+1084.488019391" Nov 27 11:27:23 crc kubenswrapper[4807]: 
I1127 11:27:23.414853 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.414839729 podStartE2EDuration="4.414839729s" podCreationTimestamp="2025-11-27 11:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:27:23.412830826 +0000 UTC m=+1084.512329024" watchObservedRunningTime="2025-11-27 11:27:23.414839729 +0000 UTC m=+1084.514337927" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.437954 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84ddb7cbfc-7lt2m"] Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.446211 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-84ddb7cbfc-7lt2m"] Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.447010 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-ff549ff99-zxxvk" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.498763 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b467c77b4-xkthn"] Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.498990 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b467c77b4-xkthn" podUID="7d8965e4-cee0-4551-8bd1-f8322e804eef" containerName="neutron-api" containerID="cri-o://ecad29e8a69528d2c562482c4041bf612ef0b19a9c64d780177334d565f7433c" gracePeriod=30 Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.499123 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-b467c77b4-xkthn" podUID="7d8965e4-cee0-4551-8bd1-f8322e804eef" containerName="neutron-httpd" containerID="cri-o://bc38cb6be7fa696e810dbf7a6fb2e74170135c020b51380d574297c7b79ab57d" gracePeriod=30 Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.543233 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="a0ccd0fd-c32d-40c1-b281-35b2b322faf7" path="/var/lib/kubelet/pods/a0ccd0fd-c32d-40c1-b281-35b2b322faf7/volumes" Nov 27 11:27:23 crc kubenswrapper[4807]: I1127 11:27:23.553678 4807 scope.go:117] "RemoveContainer" containerID="3f8babf2fcc7df3c6ede7d3b902b0240fcef73912c000b5339a196262faab499" Nov 27 11:27:24 crc kubenswrapper[4807]: I1127 11:27:24.367300 4807 generic.go:334] "Generic (PLEG): container finished" podID="7d8965e4-cee0-4551-8bd1-f8322e804eef" containerID="bc38cb6be7fa696e810dbf7a6fb2e74170135c020b51380d574297c7b79ab57d" exitCode=0 Nov 27 11:27:24 crc kubenswrapper[4807]: I1127 11:27:24.367390 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b467c77b4-xkthn" event={"ID":"7d8965e4-cee0-4551-8bd1-f8322e804eef","Type":"ContainerDied","Data":"bc38cb6be7fa696e810dbf7a6fb2e74170135c020b51380d574297c7b79ab57d"} Nov 27 11:27:25 crc kubenswrapper[4807]: I1127 11:27:25.027271 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:27:25 crc kubenswrapper[4807]: I1127 11:27:25.151328 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 27 11:27:25 crc kubenswrapper[4807]: I1127 11:27:25.227403 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 11:27:25 crc kubenswrapper[4807]: I1127 11:27:25.383782 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2062d69b-506e-4da5-8152-de320573bf94" containerName="cinder-scheduler" containerID="cri-o://9de6fcabf16c08c0b23c0d40829d98150f251b5f5d5f7d2acfb93cea528387da" gracePeriod=30 Nov 27 11:27:25 crc kubenswrapper[4807]: I1127 11:27:25.383975 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="2062d69b-506e-4da5-8152-de320573bf94" containerName="probe" 
containerID="cri-o://216940f38cd8ef8f8a5e3c7d183950a45a1fc9447110acb1eb26454e2df97482" gracePeriod=30 Nov 27 11:27:25 crc kubenswrapper[4807]: I1127 11:27:25.432985 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:27:25 crc kubenswrapper[4807]: I1127 11:27:25.527111 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:27:25 crc kubenswrapper[4807]: I1127 11:27:25.620969 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bcpjh"] Nov 27 11:27:25 crc kubenswrapper[4807]: I1127 11:27:25.621315 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" podUID="ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" containerName="dnsmasq-dns" containerID="cri-o://04788b28d9756e717adedcf5f850a9882ebeb4ba064d097cb874101afe08b0e1" gracePeriod=10 Nov 27 11:27:25 crc kubenswrapper[4807]: I1127 11:27:25.805644 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" podUID="ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.150:5353: connect: connection refused" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.191830 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.315145 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-ovsdbserver-nb\") pod \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.315510 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6b47\" (UniqueName: \"kubernetes.io/projected/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-kube-api-access-d6b47\") pod \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.315636 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-dns-swift-storage-0\") pod \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.315715 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-ovsdbserver-sb\") pod \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.315831 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-config\") pod \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.315969 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-dns-svc\") pod \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\" (UID: \"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e\") " Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.322828 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-kube-api-access-d6b47" (OuterVolumeSpecName: "kube-api-access-d6b47") pod "ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" (UID: "ce34d4ab-e6ee-43c3-9ad7-c4312bae955e"). InnerVolumeSpecName "kube-api-access-d6b47". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.366841 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" (UID: "ce34d4ab-e6ee-43c3-9ad7-c4312bae955e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.384263 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" (UID: "ce34d4ab-e6ee-43c3-9ad7-c4312bae955e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.385171 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-config" (OuterVolumeSpecName: "config") pod "ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" (UID: "ce34d4ab-e6ee-43c3-9ad7-c4312bae955e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.391886 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" (UID: "ce34d4ab-e6ee-43c3-9ad7-c4312bae955e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.393665 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" (UID: "ce34d4ab-e6ee-43c3-9ad7-c4312bae955e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.410196 4807 generic.go:334] "Generic (PLEG): container finished" podID="ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" containerID="04788b28d9756e717adedcf5f850a9882ebeb4ba064d097cb874101afe08b0e1" exitCode=0 Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.410448 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.410441 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" event={"ID":"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e","Type":"ContainerDied","Data":"04788b28d9756e717adedcf5f850a9882ebeb4ba064d097cb874101afe08b0e1"} Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.411071 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-bcpjh" event={"ID":"ce34d4ab-e6ee-43c3-9ad7-c4312bae955e","Type":"ContainerDied","Data":"9fcab10dbcaf8471a64ddfc0f84f9b4b1a7ab3a2843eb2be4874bb0fd19267d4"} Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.411102 4807 scope.go:117] "RemoveContainer" containerID="04788b28d9756e717adedcf5f850a9882ebeb4ba064d097cb874101afe08b0e1" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.418141 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.418160 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6b47\" (UniqueName: \"kubernetes.io/projected/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-kube-api-access-d6b47\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.418171 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.418179 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:26 crc 
kubenswrapper[4807]: I1127 11:27:26.418187 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.418196 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.419452 4807 generic.go:334] "Generic (PLEG): container finished" podID="2062d69b-506e-4da5-8152-de320573bf94" containerID="216940f38cd8ef8f8a5e3c7d183950a45a1fc9447110acb1eb26454e2df97482" exitCode=0 Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.419505 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2062d69b-506e-4da5-8152-de320573bf94","Type":"ContainerDied","Data":"216940f38cd8ef8f8a5e3c7d183950a45a1fc9447110acb1eb26454e2df97482"} Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.441674 4807 scope.go:117] "RemoveContainer" containerID="c70f8d1e598b23a30ebcb412146eeced718b0f0771e5578519ea18581044905e" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.447082 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bcpjh"] Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.459471 4807 scope.go:117] "RemoveContainer" containerID="04788b28d9756e717adedcf5f850a9882ebeb4ba064d097cb874101afe08b0e1" Nov 27 11:27:26 crc kubenswrapper[4807]: E1127 11:27:26.459815 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04788b28d9756e717adedcf5f850a9882ebeb4ba064d097cb874101afe08b0e1\": container with ID starting with 04788b28d9756e717adedcf5f850a9882ebeb4ba064d097cb874101afe08b0e1 not found: ID does not exist" 
containerID="04788b28d9756e717adedcf5f850a9882ebeb4ba064d097cb874101afe08b0e1" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.459849 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04788b28d9756e717adedcf5f850a9882ebeb4ba064d097cb874101afe08b0e1"} err="failed to get container status \"04788b28d9756e717adedcf5f850a9882ebeb4ba064d097cb874101afe08b0e1\": rpc error: code = NotFound desc = could not find container \"04788b28d9756e717adedcf5f850a9882ebeb4ba064d097cb874101afe08b0e1\": container with ID starting with 04788b28d9756e717adedcf5f850a9882ebeb4ba064d097cb874101afe08b0e1 not found: ID does not exist" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.459872 4807 scope.go:117] "RemoveContainer" containerID="c70f8d1e598b23a30ebcb412146eeced718b0f0771e5578519ea18581044905e" Nov 27 11:27:26 crc kubenswrapper[4807]: E1127 11:27:26.460318 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c70f8d1e598b23a30ebcb412146eeced718b0f0771e5578519ea18581044905e\": container with ID starting with c70f8d1e598b23a30ebcb412146eeced718b0f0771e5578519ea18581044905e not found: ID does not exist" containerID="c70f8d1e598b23a30ebcb412146eeced718b0f0771e5578519ea18581044905e" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.460346 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70f8d1e598b23a30ebcb412146eeced718b0f0771e5578519ea18581044905e"} err="failed to get container status \"c70f8d1e598b23a30ebcb412146eeced718b0f0771e5578519ea18581044905e\": rpc error: code = NotFound desc = could not find container \"c70f8d1e598b23a30ebcb412146eeced718b0f0771e5578519ea18581044905e\": container with ID starting with c70f8d1e598b23a30ebcb412146eeced718b0f0771e5578519ea18581044905e not found: ID does not exist" Nov 27 11:27:26 crc kubenswrapper[4807]: I1127 11:27:26.460427 4807 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-bcpjh"] Nov 27 11:27:27 crc kubenswrapper[4807]: I1127 11:27:27.080010 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:27:27 crc kubenswrapper[4807]: I1127 11:27:27.223008 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:27 crc kubenswrapper[4807]: I1127 11:27:27.625628 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" path="/var/lib/kubelet/pods/ce34d4ab-e6ee-43c3-9ad7-c4312bae955e/volumes" Nov 27 11:27:27 crc kubenswrapper[4807]: I1127 11:27:27.626651 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:27 crc kubenswrapper[4807]: I1127 11:27:27.634651 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7d69cff6fb-88t5t" Nov 27 11:27:27 crc kubenswrapper[4807]: I1127 11:27:27.767966 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b8bd6c76d-jg9hp"] Nov 27 11:27:27 crc kubenswrapper[4807]: I1127 11:27:27.768202 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b8bd6c76d-jg9hp" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerName="horizon-log" containerID="cri-o://0431edfd02689c766bfba84526203198805f6a64725b59ad7a98d79e2b6095d8" gracePeriod=30 Nov 27 11:27:27 crc kubenswrapper[4807]: I1127 11:27:27.768656 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7b8bd6c76d-jg9hp" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerName="horizon" containerID="cri-o://20965bc1b41dbc2aaabb0bc9a8457ca68ec138460ec493e4bad74931239240d6" gracePeriod=30 Nov 27 11:27:28 crc kubenswrapper[4807]: I1127 11:27:28.059139 4807 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:27:28 crc kubenswrapper[4807]: I1127 11:27:28.060601 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5449cc7d8-rpm6t" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.276093 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.402785 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-combined-ca-bundle\") pod \"2062d69b-506e-4da5-8152-de320573bf94\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.402831 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-scripts\") pod \"2062d69b-506e-4da5-8152-de320573bf94\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.402914 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2062d69b-506e-4da5-8152-de320573bf94-etc-machine-id\") pod \"2062d69b-506e-4da5-8152-de320573bf94\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.402935 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-config-data-custom\") pod \"2062d69b-506e-4da5-8152-de320573bf94\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.402964 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lvqxc\" (UniqueName: \"kubernetes.io/projected/2062d69b-506e-4da5-8152-de320573bf94-kube-api-access-lvqxc\") pod \"2062d69b-506e-4da5-8152-de320573bf94\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.403015 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-config-data\") pod \"2062d69b-506e-4da5-8152-de320573bf94\" (UID: \"2062d69b-506e-4da5-8152-de320573bf94\") " Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.403019 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2062d69b-506e-4da5-8152-de320573bf94-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2062d69b-506e-4da5-8152-de320573bf94" (UID: "2062d69b-506e-4da5-8152-de320573bf94"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.403531 4807 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2062d69b-506e-4da5-8152-de320573bf94-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.413371 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-scripts" (OuterVolumeSpecName: "scripts") pod "2062d69b-506e-4da5-8152-de320573bf94" (UID: "2062d69b-506e-4da5-8152-de320573bf94"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.422921 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2062d69b-506e-4da5-8152-de320573bf94" (UID: "2062d69b-506e-4da5-8152-de320573bf94"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.423762 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2062d69b-506e-4da5-8152-de320573bf94-kube-api-access-lvqxc" (OuterVolumeSpecName: "kube-api-access-lvqxc") pod "2062d69b-506e-4da5-8152-de320573bf94" (UID: "2062d69b-506e-4da5-8152-de320573bf94"). InnerVolumeSpecName "kube-api-access-lvqxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.457880 4807 generic.go:334] "Generic (PLEG): container finished" podID="2062d69b-506e-4da5-8152-de320573bf94" containerID="9de6fcabf16c08c0b23c0d40829d98150f251b5f5d5f7d2acfb93cea528387da" exitCode=0 Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.457929 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2062d69b-506e-4da5-8152-de320573bf94","Type":"ContainerDied","Data":"9de6fcabf16c08c0b23c0d40829d98150f251b5f5d5f7d2acfb93cea528387da"} Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.457961 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"2062d69b-506e-4da5-8152-de320573bf94","Type":"ContainerDied","Data":"7674985ba05511335ef6b5ab8600ce67dbea98dba7ce080fd4121fc74d8546d1"} Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.457981 4807 scope.go:117] "RemoveContainer" 
containerID="216940f38cd8ef8f8a5e3c7d183950a45a1fc9447110acb1eb26454e2df97482" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.458116 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.463237 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2062d69b-506e-4da5-8152-de320573bf94" (UID: "2062d69b-506e-4da5-8152-de320573bf94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.505446 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.505478 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.505488 4807 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.505498 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvqxc\" (UniqueName: \"kubernetes.io/projected/2062d69b-506e-4da5-8152-de320573bf94-kube-api-access-lvqxc\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.510753 4807 scope.go:117] "RemoveContainer" containerID="9de6fcabf16c08c0b23c0d40829d98150f251b5f5d5f7d2acfb93cea528387da" Nov 27 11:27:29 crc 
kubenswrapper[4807]: I1127 11:27:29.528726 4807 scope.go:117] "RemoveContainer" containerID="216940f38cd8ef8f8a5e3c7d183950a45a1fc9447110acb1eb26454e2df97482" Nov 27 11:27:29 crc kubenswrapper[4807]: E1127 11:27:29.529195 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"216940f38cd8ef8f8a5e3c7d183950a45a1fc9447110acb1eb26454e2df97482\": container with ID starting with 216940f38cd8ef8f8a5e3c7d183950a45a1fc9447110acb1eb26454e2df97482 not found: ID does not exist" containerID="216940f38cd8ef8f8a5e3c7d183950a45a1fc9447110acb1eb26454e2df97482" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.529230 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"216940f38cd8ef8f8a5e3c7d183950a45a1fc9447110acb1eb26454e2df97482"} err="failed to get container status \"216940f38cd8ef8f8a5e3c7d183950a45a1fc9447110acb1eb26454e2df97482\": rpc error: code = NotFound desc = could not find container \"216940f38cd8ef8f8a5e3c7d183950a45a1fc9447110acb1eb26454e2df97482\": container with ID starting with 216940f38cd8ef8f8a5e3c7d183950a45a1fc9447110acb1eb26454e2df97482 not found: ID does not exist" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.529272 4807 scope.go:117] "RemoveContainer" containerID="9de6fcabf16c08c0b23c0d40829d98150f251b5f5d5f7d2acfb93cea528387da" Nov 27 11:27:29 crc kubenswrapper[4807]: E1127 11:27:29.529871 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9de6fcabf16c08c0b23c0d40829d98150f251b5f5d5f7d2acfb93cea528387da\": container with ID starting with 9de6fcabf16c08c0b23c0d40829d98150f251b5f5d5f7d2acfb93cea528387da not found: ID does not exist" containerID="9de6fcabf16c08c0b23c0d40829d98150f251b5f5d5f7d2acfb93cea528387da" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.529910 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9de6fcabf16c08c0b23c0d40829d98150f251b5f5d5f7d2acfb93cea528387da"} err="failed to get container status \"9de6fcabf16c08c0b23c0d40829d98150f251b5f5d5f7d2acfb93cea528387da\": rpc error: code = NotFound desc = could not find container \"9de6fcabf16c08c0b23c0d40829d98150f251b5f5d5f7d2acfb93cea528387da\": container with ID starting with 9de6fcabf16c08c0b23c0d40829d98150f251b5f5d5f7d2acfb93cea528387da not found: ID does not exist" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.534669 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-config-data" (OuterVolumeSpecName: "config-data") pod "2062d69b-506e-4da5-8152-de320573bf94" (UID: "2062d69b-506e-4da5-8152-de320573bf94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.607539 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2062d69b-506e-4da5-8152-de320573bf94-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.814808 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.824490 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.846132 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 11:27:29 crc kubenswrapper[4807]: E1127 11:27:29.846814 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ccd0fd-c32d-40c1-b281-35b2b322faf7" containerName="horizon-log" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.846836 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ccd0fd-c32d-40c1-b281-35b2b322faf7" containerName="horizon-log" 
Nov 27 11:27:29 crc kubenswrapper[4807]: E1127 11:27:29.846849 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" containerName="dnsmasq-dns" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.846855 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" containerName="dnsmasq-dns" Nov 27 11:27:29 crc kubenswrapper[4807]: E1127 11:27:29.846870 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ccd0fd-c32d-40c1-b281-35b2b322faf7" containerName="horizon" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.846877 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ccd0fd-c32d-40c1-b281-35b2b322faf7" containerName="horizon" Nov 27 11:27:29 crc kubenswrapper[4807]: E1127 11:27:29.846893 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2062d69b-506e-4da5-8152-de320573bf94" containerName="cinder-scheduler" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.846899 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2062d69b-506e-4da5-8152-de320573bf94" containerName="cinder-scheduler" Nov 27 11:27:29 crc kubenswrapper[4807]: E1127 11:27:29.846912 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" containerName="init" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.846918 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" containerName="init" Nov 27 11:27:29 crc kubenswrapper[4807]: E1127 11:27:29.846938 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2062d69b-506e-4da5-8152-de320573bf94" containerName="probe" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.846943 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2062d69b-506e-4da5-8152-de320573bf94" containerName="probe" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.847114 4807 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ccd0fd-c32d-40c1-b281-35b2b322faf7" containerName="horizon" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.847131 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="2062d69b-506e-4da5-8152-de320573bf94" containerName="cinder-scheduler" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.847150 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ccd0fd-c32d-40c1-b281-35b2b322faf7" containerName="horizon-log" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.847157 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="2062d69b-506e-4da5-8152-de320573bf94" containerName="probe" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.847168 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce34d4ab-e6ee-43c3-9ad7-c4312bae955e" containerName="dnsmasq-dns" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.848332 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.850828 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 27 11:27:29 crc kubenswrapper[4807]: I1127 11:27:29.862030 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.015897 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cvx6\" (UniqueName: \"kubernetes.io/projected/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-kube-api-access-5cvx6\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.015960 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.016056 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.016171 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc 
kubenswrapper[4807]: I1127 11:27:30.016197 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-scripts\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.016222 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-config-data\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.117641 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.117959 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-scripts\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.118062 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-config-data\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.118216 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cvx6\" (UniqueName: 
\"kubernetes.io/projected/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-kube-api-access-5cvx6\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.118334 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.118439 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.118633 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.125365 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-scripts\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.125384 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-config-data\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" 
Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.134530 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.139845 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cvx6\" (UniqueName: \"kubernetes.io/projected/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-kube-api-access-5cvx6\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.140870 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb01b182-7b19-44ea-874b-3ad6a1ebb6a7-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7\") " pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.181637 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 27 11:27:30 crc kubenswrapper[4807]: I1127 11:27:30.759027 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 27 11:27:30 crc kubenswrapper[4807]: W1127 11:27:30.764112 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb01b182_7b19_44ea_874b_3ad6a1ebb6a7.slice/crio-d57d313d6ffd032b3d1837f4d417e46b08f1ff4d35986ea213a40c5a3fcd304b WatchSource:0}: Error finding container d57d313d6ffd032b3d1837f4d417e46b08f1ff4d35986ea213a40c5a3fcd304b: Status 404 returned error can't find the container with id d57d313d6ffd032b3d1837f4d417e46b08f1ff4d35986ea213a40c5a3fcd304b Nov 27 11:27:31 crc kubenswrapper[4807]: I1127 11:27:31.480705 4807 generic.go:334] "Generic (PLEG): container finished" podID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerID="20965bc1b41dbc2aaabb0bc9a8457ca68ec138460ec493e4bad74931239240d6" exitCode=0 Nov 27 11:27:31 crc kubenswrapper[4807]: I1127 11:27:31.480771 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8bd6c76d-jg9hp" event={"ID":"972db85e-5d7f-4312-b2c1-36f3c4e697d3","Type":"ContainerDied","Data":"20965bc1b41dbc2aaabb0bc9a8457ca68ec138460ec493e4bad74931239240d6"} Nov 27 11:27:31 crc kubenswrapper[4807]: I1127 11:27:31.482712 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7","Type":"ContainerStarted","Data":"a382042d8f631586b1c7f2327d510f7ee93d2b1ba948b1b8818f9b23b5d7322a"} Nov 27 11:27:31 crc kubenswrapper[4807]: I1127 11:27:31.482757 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7","Type":"ContainerStarted","Data":"d57d313d6ffd032b3d1837f4d417e46b08f1ff4d35986ea213a40c5a3fcd304b"} Nov 27 11:27:31 crc kubenswrapper[4807]: I1127 11:27:31.550497 4807 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2062d69b-506e-4da5-8152-de320573bf94" path="/var/lib/kubelet/pods/2062d69b-506e-4da5-8152-de320573bf94/volumes" Nov 27 11:27:31 crc kubenswrapper[4807]: I1127 11:27:31.588071 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 27 11:27:32 crc kubenswrapper[4807]: I1127 11:27:32.492316 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"eb01b182-7b19-44ea-874b-3ad6a1ebb6a7","Type":"ContainerStarted","Data":"4068f7ce9e61cfc315bf30d936feb9e014954074c37ca9e865287ada101d50c2"} Nov 27 11:27:32 crc kubenswrapper[4807]: I1127 11:27:32.511896 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.511881745 podStartE2EDuration="3.511881745s" podCreationTimestamp="2025-11-27 11:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:27:32.509932094 +0000 UTC m=+1093.609430292" watchObservedRunningTime="2025-11-27 11:27:32.511881745 +0000 UTC m=+1093.611379933" Nov 27 11:27:33 crc kubenswrapper[4807]: I1127 11:27:33.105041 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:33 crc kubenswrapper[4807]: I1127 11:27:33.153007 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b8bd6c76d-jg9hp" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Nov 27 11:27:33 crc kubenswrapper[4807]: I1127 11:27:33.265264 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7b4c869746-crx9p" Nov 27 11:27:33 crc 
kubenswrapper[4807]: I1127 11:27:33.352340 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7fb8574d4b-zwdrx"] Nov 27 11:27:33 crc kubenswrapper[4807]: I1127 11:27:33.353037 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7fb8574d4b-zwdrx" podUID="ab376f86-cb64-4c79-a6ec-701e75c42a9f" containerName="barbican-api" containerID="cri-o://48643af9b551b090c3d33f2d5ef22613ef6690d804f94b1c5d64008505b291e9" gracePeriod=30 Nov 27 11:27:33 crc kubenswrapper[4807]: I1127 11:27:33.356861 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7fb8574d4b-zwdrx" podUID="ab376f86-cb64-4c79-a6ec-701e75c42a9f" containerName="barbican-api-log" containerID="cri-o://132750a51fb2171b1d9d2429e2421a024cf4c4617bd219a0f513640f723e2131" gracePeriod=30 Nov 27 11:27:33 crc kubenswrapper[4807]: I1127 11:27:33.515039 4807 generic.go:334] "Generic (PLEG): container finished" podID="ab376f86-cb64-4c79-a6ec-701e75c42a9f" containerID="132750a51fb2171b1d9d2429e2421a024cf4c4617bd219a0f513640f723e2131" exitCode=143 Nov 27 11:27:33 crc kubenswrapper[4807]: I1127 11:27:33.516139 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fb8574d4b-zwdrx" event={"ID":"ab376f86-cb64-4c79-a6ec-701e75c42a9f","Type":"ContainerDied","Data":"132750a51fb2171b1d9d2429e2421a024cf4c4617bd219a0f513640f723e2131"} Nov 27 11:27:34 crc kubenswrapper[4807]: I1127 11:27:34.006071 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-54b9d76d5d-4mfvr" Nov 27 11:27:35 crc kubenswrapper[4807]: I1127 11:27:35.181907 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 27 11:27:36 crc kubenswrapper[4807]: I1127 11:27:36.561229 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fb8574d4b-zwdrx" 
podUID="ab376f86-cb64-4c79-a6ec-701e75c42a9f" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:50476->10.217.0.162:9311: read: connection reset by peer" Nov 27 11:27:36 crc kubenswrapper[4807]: I1127 11:27:36.561380 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7fb8574d4b-zwdrx" podUID="ab376f86-cb64-4c79-a6ec-701e75c42a9f" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:50462->10.217.0.162:9311: read: connection reset by peer" Nov 27 11:27:36 crc kubenswrapper[4807]: I1127 11:27:36.967521 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.076264 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f4nj\" (UniqueName: \"kubernetes.io/projected/ab376f86-cb64-4c79-a6ec-701e75c42a9f-kube-api-access-7f4nj\") pod \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.076349 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-config-data-custom\") pod \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.076405 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-combined-ca-bundle\") pod \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.076444 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab376f86-cb64-4c79-a6ec-701e75c42a9f-logs\") pod \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.076475 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-config-data\") pod \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\" (UID: \"ab376f86-cb64-4c79-a6ec-701e75c42a9f\") " Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.077680 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab376f86-cb64-4c79-a6ec-701e75c42a9f-logs" (OuterVolumeSpecName: "logs") pod "ab376f86-cb64-4c79-a6ec-701e75c42a9f" (UID: "ab376f86-cb64-4c79-a6ec-701e75c42a9f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.082688 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ab376f86-cb64-4c79-a6ec-701e75c42a9f" (UID: "ab376f86-cb64-4c79-a6ec-701e75c42a9f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.085794 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab376f86-cb64-4c79-a6ec-701e75c42a9f-kube-api-access-7f4nj" (OuterVolumeSpecName: "kube-api-access-7f4nj") pod "ab376f86-cb64-4c79-a6ec-701e75c42a9f" (UID: "ab376f86-cb64-4c79-a6ec-701e75c42a9f"). InnerVolumeSpecName "kube-api-access-7f4nj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.108607 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab376f86-cb64-4c79-a6ec-701e75c42a9f" (UID: "ab376f86-cb64-4c79-a6ec-701e75c42a9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.134774 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-config-data" (OuterVolumeSpecName: "config-data") pod "ab376f86-cb64-4c79-a6ec-701e75c42a9f" (UID: "ab376f86-cb64-4c79-a6ec-701e75c42a9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.178178 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f4nj\" (UniqueName: \"kubernetes.io/projected/ab376f86-cb64-4c79-a6ec-701e75c42a9f-kube-api-access-7f4nj\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.178210 4807 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.178220 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.178228 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ab376f86-cb64-4c79-a6ec-701e75c42a9f-logs\") on node \"crc\" 
DevicePath \"\"" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.178238 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab376f86-cb64-4c79-a6ec-701e75c42a9f-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.576538 4807 generic.go:334] "Generic (PLEG): container finished" podID="ab376f86-cb64-4c79-a6ec-701e75c42a9f" containerID="48643af9b551b090c3d33f2d5ef22613ef6690d804f94b1c5d64008505b291e9" exitCode=0 Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.576597 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fb8574d4b-zwdrx" event={"ID":"ab376f86-cb64-4c79-a6ec-701e75c42a9f","Type":"ContainerDied","Data":"48643af9b551b090c3d33f2d5ef22613ef6690d804f94b1c5d64008505b291e9"} Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.576661 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7fb8574d4b-zwdrx" event={"ID":"ab376f86-cb64-4c79-a6ec-701e75c42a9f","Type":"ContainerDied","Data":"52330ff553d797c7d02bad9b0fb5c193117a98bbb73364d40e08445379bb9642"} Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.576696 4807 scope.go:117] "RemoveContainer" containerID="48643af9b551b090c3d33f2d5ef22613ef6690d804f94b1c5d64008505b291e9" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.576928 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7fb8574d4b-zwdrx" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.610296 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7fb8574d4b-zwdrx"] Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.616593 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7fb8574d4b-zwdrx"] Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.623189 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 27 11:27:37 crc kubenswrapper[4807]: E1127 11:27:37.623542 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab376f86-cb64-4c79-a6ec-701e75c42a9f" containerName="barbican-api-log" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.623555 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab376f86-cb64-4c79-a6ec-701e75c42a9f" containerName="barbican-api-log" Nov 27 11:27:37 crc kubenswrapper[4807]: E1127 11:27:37.623593 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab376f86-cb64-4c79-a6ec-701e75c42a9f" containerName="barbican-api" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.623598 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab376f86-cb64-4c79-a6ec-701e75c42a9f" containerName="barbican-api" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.623762 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab376f86-cb64-4c79-a6ec-701e75c42a9f" containerName="barbican-api" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.623781 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab376f86-cb64-4c79-a6ec-701e75c42a9f" containerName="barbican-api-log" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.624332 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.634226 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.634403 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.634535 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-h88r6" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.643476 4807 scope.go:117] "RemoveContainer" containerID="132750a51fb2171b1d9d2429e2421a024cf4c4617bd219a0f513640f723e2131" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.652505 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.679231 4807 scope.go:117] "RemoveContainer" containerID="48643af9b551b090c3d33f2d5ef22613ef6690d804f94b1c5d64008505b291e9" Nov 27 11:27:37 crc kubenswrapper[4807]: E1127 11:27:37.679768 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48643af9b551b090c3d33f2d5ef22613ef6690d804f94b1c5d64008505b291e9\": container with ID starting with 48643af9b551b090c3d33f2d5ef22613ef6690d804f94b1c5d64008505b291e9 not found: ID does not exist" containerID="48643af9b551b090c3d33f2d5ef22613ef6690d804f94b1c5d64008505b291e9" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.679813 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48643af9b551b090c3d33f2d5ef22613ef6690d804f94b1c5d64008505b291e9"} err="failed to get container status \"48643af9b551b090c3d33f2d5ef22613ef6690d804f94b1c5d64008505b291e9\": rpc error: code = NotFound desc = could not find container 
\"48643af9b551b090c3d33f2d5ef22613ef6690d804f94b1c5d64008505b291e9\": container with ID starting with 48643af9b551b090c3d33f2d5ef22613ef6690d804f94b1c5d64008505b291e9 not found: ID does not exist" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.679840 4807 scope.go:117] "RemoveContainer" containerID="132750a51fb2171b1d9d2429e2421a024cf4c4617bd219a0f513640f723e2131" Nov 27 11:27:37 crc kubenswrapper[4807]: E1127 11:27:37.680189 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"132750a51fb2171b1d9d2429e2421a024cf4c4617bd219a0f513640f723e2131\": container with ID starting with 132750a51fb2171b1d9d2429e2421a024cf4c4617bd219a0f513640f723e2131 not found: ID does not exist" containerID="132750a51fb2171b1d9d2429e2421a024cf4c4617bd219a0f513640f723e2131" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.680211 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"132750a51fb2171b1d9d2429e2421a024cf4c4617bd219a0f513640f723e2131"} err="failed to get container status \"132750a51fb2171b1d9d2429e2421a024cf4c4617bd219a0f513640f723e2131\": rpc error: code = NotFound desc = could not find container \"132750a51fb2171b1d9d2429e2421a024cf4c4617bd219a0f513640f723e2131\": container with ID starting with 132750a51fb2171b1d9d2429e2421a024cf4c4617bd219a0f513640f723e2131 not found: ID does not exist" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.689846 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/99f92409-b35d-4905-bdac-488235b8c054-openstack-config-secret\") pod \"openstackclient\" (UID: \"99f92409-b35d-4905-bdac-488235b8c054\") " pod="openstack/openstackclient" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.691316 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/99f92409-b35d-4905-bdac-488235b8c054-openstack-config\") pod \"openstackclient\" (UID: \"99f92409-b35d-4905-bdac-488235b8c054\") " pod="openstack/openstackclient" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.691356 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6kzc\" (UniqueName: \"kubernetes.io/projected/99f92409-b35d-4905-bdac-488235b8c054-kube-api-access-h6kzc\") pod \"openstackclient\" (UID: \"99f92409-b35d-4905-bdac-488235b8c054\") " pod="openstack/openstackclient" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.691403 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f92409-b35d-4905-bdac-488235b8c054-combined-ca-bundle\") pod \"openstackclient\" (UID: \"99f92409-b35d-4905-bdac-488235b8c054\") " pod="openstack/openstackclient" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.793127 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/99f92409-b35d-4905-bdac-488235b8c054-openstack-config\") pod \"openstackclient\" (UID: \"99f92409-b35d-4905-bdac-488235b8c054\") " pod="openstack/openstackclient" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.793178 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6kzc\" (UniqueName: \"kubernetes.io/projected/99f92409-b35d-4905-bdac-488235b8c054-kube-api-access-h6kzc\") pod \"openstackclient\" (UID: \"99f92409-b35d-4905-bdac-488235b8c054\") " pod="openstack/openstackclient" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.793223 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/99f92409-b35d-4905-bdac-488235b8c054-combined-ca-bundle\") pod \"openstackclient\" (UID: \"99f92409-b35d-4905-bdac-488235b8c054\") " pod="openstack/openstackclient" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.793281 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/99f92409-b35d-4905-bdac-488235b8c054-openstack-config-secret\") pod \"openstackclient\" (UID: \"99f92409-b35d-4905-bdac-488235b8c054\") " pod="openstack/openstackclient" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.794012 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/99f92409-b35d-4905-bdac-488235b8c054-openstack-config\") pod \"openstackclient\" (UID: \"99f92409-b35d-4905-bdac-488235b8c054\") " pod="openstack/openstackclient" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.798362 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/99f92409-b35d-4905-bdac-488235b8c054-openstack-config-secret\") pod \"openstackclient\" (UID: \"99f92409-b35d-4905-bdac-488235b8c054\") " pod="openstack/openstackclient" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.799256 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99f92409-b35d-4905-bdac-488235b8c054-combined-ca-bundle\") pod \"openstackclient\" (UID: \"99f92409-b35d-4905-bdac-488235b8c054\") " pod="openstack/openstackclient" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.811989 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6kzc\" (UniqueName: \"kubernetes.io/projected/99f92409-b35d-4905-bdac-488235b8c054-kube-api-access-h6kzc\") pod \"openstackclient\" (UID: \"99f92409-b35d-4905-bdac-488235b8c054\") " 
pod="openstack/openstackclient" Nov 27 11:27:37 crc kubenswrapper[4807]: I1127 11:27:37.950840 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 27 11:27:38 crc kubenswrapper[4807]: I1127 11:27:38.397183 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 27 11:27:38 crc kubenswrapper[4807]: I1127 11:27:38.586195 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"99f92409-b35d-4905-bdac-488235b8c054","Type":"ContainerStarted","Data":"56cbf71d5de70da530bf47c34aa6554bd64ca049016669782e290e519b9e67a6"} Nov 27 11:27:39 crc kubenswrapper[4807]: I1127 11:27:39.544718 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab376f86-cb64-4c79-a6ec-701e75c42a9f" path="/var/lib/kubelet/pods/ab376f86-cb64-4c79-a6ec-701e75c42a9f/volumes" Nov 27 11:27:40 crc kubenswrapper[4807]: I1127 11:27:40.393901 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.728583 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7b6fc97755-xnlzr"] Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.732368 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.737619 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.737742 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.737809 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.747057 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7b6fc97755-xnlzr"] Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.862343 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-run-httpd\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.862642 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-combined-ca-bundle\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.862663 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-etc-swift\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 
11:27:41.862730 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-public-tls-certs\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.862746 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq58k\" (UniqueName: \"kubernetes.io/projected/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-kube-api-access-mq58k\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.862781 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-log-httpd\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.862802 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-internal-tls-certs\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.862861 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-config-data\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 
27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.963897 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-run-httpd\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.963977 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-combined-ca-bundle\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.963998 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-etc-swift\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.964062 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-public-tls-certs\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.964079 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq58k\" (UniqueName: \"kubernetes.io/projected/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-kube-api-access-mq58k\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 
11:27:41.964116 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-log-httpd\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.964136 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-internal-tls-certs\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.964183 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-config-data\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.965723 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-log-httpd\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.965776 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-run-httpd\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.970772 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-internal-tls-certs\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.971240 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-etc-swift\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.972933 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-config-data\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.972975 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-combined-ca-bundle\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.986137 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-public-tls-certs\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:41 crc kubenswrapper[4807]: I1127 11:27:41.987561 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq58k\" (UniqueName: 
\"kubernetes.io/projected/ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257-kube-api-access-mq58k\") pod \"swift-proxy-7b6fc97755-xnlzr\" (UID: \"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257\") " pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:42 crc kubenswrapper[4807]: I1127 11:27:42.054338 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:42 crc kubenswrapper[4807]: I1127 11:27:42.593454 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7b6fc97755-xnlzr"] Nov 27 11:27:43 crc kubenswrapper[4807]: I1127 11:27:43.145742 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b8bd6c76d-jg9hp" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.496206 4807 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod7a3d55df-3e92-4cb5-aedd-7589b72d5471"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod7a3d55df-3e92-4cb5-aedd-7589b72d5471] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7a3d55df_3e92_4cb5_aedd_7589b72d5471.slice" Nov 27 11:27:47 crc kubenswrapper[4807]: E1127 11:27:47.496790 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod7a3d55df-3e92-4cb5-aedd-7589b72d5471] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod7a3d55df-3e92-4cb5-aedd-7589b72d5471] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7a3d55df_3e92_4cb5_aedd_7589b72d5471.slice" pod="openstack/ceilometer-0" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.678826 4807 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"99f92409-b35d-4905-bdac-488235b8c054","Type":"ContainerStarted","Data":"efb9e9a9302bd25d7ec08256290a663e4625760bf001483e904b02e3d140e511"} Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.680544 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.680556 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b6fc97755-xnlzr" event={"ID":"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257","Type":"ContainerStarted","Data":"54b51923b0d34d96f31033e87a751b2a997f662517ff023f5a089cde283bc8b3"} Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.680667 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b6fc97755-xnlzr" event={"ID":"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257","Type":"ContainerStarted","Data":"feeb4f250fa1d464bca05a9b35baa5eb3dd19bc49f641a97fbfd5b3127b21819"} Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.697306 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.692286911 podStartE2EDuration="10.69729125s" podCreationTimestamp="2025-11-27 11:27:37 +0000 UTC" firstStartedPulling="2025-11-27 11:27:38.400450972 +0000 UTC m=+1099.499949170" lastFinishedPulling="2025-11-27 11:27:47.405455291 +0000 UTC m=+1108.504953509" observedRunningTime="2025-11-27 11:27:47.696984712 +0000 UTC m=+1108.796482910" watchObservedRunningTime="2025-11-27 11:27:47.69729125 +0000 UTC m=+1108.796789448" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.760412 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.798564 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.805712 4807 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.808889 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.811050 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.811383 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.820061 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.885782 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-scripts\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.886099 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkxtr\" (UniqueName: \"kubernetes.io/projected/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-kube-api-access-xkxtr\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.886225 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-log-httpd\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.886328 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-config-data\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.886439 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-run-httpd\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.886510 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.886599 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.985018 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.985315 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9b84df57-c9f1-4b55-ab31-7133a1d0841f" containerName="glance-log" containerID="cri-o://526d0bbbf0f2beccfa2ee989e3a938c3c9ce94aca54e0f5a3032d5ee616b4023" gracePeriod=30 Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.985435 4807 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="9b84df57-c9f1-4b55-ab31-7133a1d0841f" containerName="glance-httpd" containerID="cri-o://001077e87a62c11f8824eb2148bf6ad305af8a0bafe4d746200720ae51a1d2db" gracePeriod=30 Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.988556 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-config-data\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.988644 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-run-httpd\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.988675 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.988729 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.988763 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-scripts\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc 
kubenswrapper[4807]: I1127 11:27:47.988839 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkxtr\" (UniqueName: \"kubernetes.io/projected/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-kube-api-access-xkxtr\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.988934 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-log-httpd\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.989769 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-log-httpd\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.990381 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-run-httpd\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.993278 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-scripts\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.995681 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.995899 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-config-data\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:47 crc kubenswrapper[4807]: I1127 11:27:47.998113 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:48 crc kubenswrapper[4807]: I1127 11:27:48.009447 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkxtr\" (UniqueName: \"kubernetes.io/projected/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-kube-api-access-xkxtr\") pod \"ceilometer-0\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " pod="openstack/ceilometer-0" Nov 27 11:27:48 crc kubenswrapper[4807]: I1127 11:27:48.126678 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:27:48 crc kubenswrapper[4807]: I1127 11:27:48.571031 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:27:48 crc kubenswrapper[4807]: I1127 11:27:48.690536 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b6fc97755-xnlzr" event={"ID":"ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257","Type":"ContainerStarted","Data":"69902b8986c3ebf9de23c5e8955cde313db9b565b714cdd5fe0e41cf8569ef63"} Nov 27 11:27:48 crc kubenswrapper[4807]: I1127 11:27:48.690673 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:48 crc kubenswrapper[4807]: I1127 11:27:48.690709 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:48 crc kubenswrapper[4807]: I1127 11:27:48.695470 4807 generic.go:334] "Generic (PLEG): container finished" podID="9b84df57-c9f1-4b55-ab31-7133a1d0841f" containerID="526d0bbbf0f2beccfa2ee989e3a938c3c9ce94aca54e0f5a3032d5ee616b4023" exitCode=143 Nov 27 11:27:48 crc kubenswrapper[4807]: I1127 11:27:48.695547 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b84df57-c9f1-4b55-ab31-7133a1d0841f","Type":"ContainerDied","Data":"526d0bbbf0f2beccfa2ee989e3a938c3c9ce94aca54e0f5a3032d5ee616b4023"} Nov 27 11:27:48 crc kubenswrapper[4807]: I1127 11:27:48.697405 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f4c8710-cab5-4bf1-8bdd-86e6350e8058","Type":"ContainerStarted","Data":"9998d6c53d95f028880a1e60c79eb33067b25f3fd37277a213b89ed84540c7e3"} Nov 27 11:27:48 crc kubenswrapper[4807]: I1127 11:27:48.706790 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7b6fc97755-xnlzr" podStartSLOduration=7.706773482 podStartE2EDuration="7.706773482s" 
podCreationTimestamp="2025-11-27 11:27:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:27:48.705869338 +0000 UTC m=+1109.805367536" watchObservedRunningTime="2025-11-27 11:27:48.706773482 +0000 UTC m=+1109.806271680" Nov 27 11:27:49 crc kubenswrapper[4807]: I1127 11:27:49.548172 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a3d55df-3e92-4cb5-aedd-7589b72d5471" path="/var/lib/kubelet/pods/7a3d55df-3e92-4cb5-aedd-7589b72d5471/volumes" Nov 27 11:27:49 crc kubenswrapper[4807]: I1127 11:27:49.707192 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f4c8710-cab5-4bf1-8bdd-86e6350e8058","Type":"ContainerStarted","Data":"5671143180d9375e881250f0827afefd8468a1a07d89b4cd65cb23c88860d288"} Nov 27 11:27:50 crc kubenswrapper[4807]: I1127 11:27:50.349175 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:27:50 crc kubenswrapper[4807]: I1127 11:27:50.717327 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f4c8710-cab5-4bf1-8bdd-86e6350e8058","Type":"ContainerStarted","Data":"8023000156d97e1e14ea3cfdecc4769d87ac45c20305e073611f2453d4283d30"} Nov 27 11:27:50 crc kubenswrapper[4807]: I1127 11:27:50.920066 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-b467c77b4-xkthn" podUID="7d8965e4-cee0-4551-8bd1-f8322e804eef" containerName="neutron-httpd" probeResult="failure" output="Get \"http://10.217.0.151:9696/\": dial tcp 10.217.0.151:9696: connect: connection refused" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.148722 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="9b84df57-c9f1-4b55-ab31-7133a1d0841f" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read 
tcp 10.217.0.2:39598->10.217.0.149:9292: read: connection reset by peer" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.149481 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="9b84df57-c9f1-4b55-ab31-7133a1d0841f" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:39606->10.217.0.149:9292: read: connection reset by peer" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.645701 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.734110 4807 generic.go:334] "Generic (PLEG): container finished" podID="9b84df57-c9f1-4b55-ab31-7133a1d0841f" containerID="001077e87a62c11f8824eb2148bf6ad305af8a0bafe4d746200720ae51a1d2db" exitCode=0 Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.734174 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b84df57-c9f1-4b55-ab31-7133a1d0841f","Type":"ContainerDied","Data":"001077e87a62c11f8824eb2148bf6ad305af8a0bafe4d746200720ae51a1d2db"} Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.734204 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9b84df57-c9f1-4b55-ab31-7133a1d0841f","Type":"ContainerDied","Data":"36488c1345b93132845ea300ca5ccb92cdf660fd57a5523a785674d8644b31cd"} Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.734221 4807 scope.go:117] "RemoveContainer" containerID="001077e87a62c11f8824eb2148bf6ad305af8a0bafe4d746200720ae51a1d2db" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.734671 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.737573 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f4c8710-cab5-4bf1-8bdd-86e6350e8058","Type":"ContainerStarted","Data":"149cf5085c74a35f044da2d633419d3b6d777c039955da32466803b43527f85c"} Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.758133 4807 scope.go:117] "RemoveContainer" containerID="526d0bbbf0f2beccfa2ee989e3a938c3c9ce94aca54e0f5a3032d5ee616b4023" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.781100 4807 scope.go:117] "RemoveContainer" containerID="001077e87a62c11f8824eb2148bf6ad305af8a0bafe4d746200720ae51a1d2db" Nov 27 11:27:51 crc kubenswrapper[4807]: E1127 11:27:51.781566 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"001077e87a62c11f8824eb2148bf6ad305af8a0bafe4d746200720ae51a1d2db\": container with ID starting with 001077e87a62c11f8824eb2148bf6ad305af8a0bafe4d746200720ae51a1d2db not found: ID does not exist" containerID="001077e87a62c11f8824eb2148bf6ad305af8a0bafe4d746200720ae51a1d2db" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.781606 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001077e87a62c11f8824eb2148bf6ad305af8a0bafe4d746200720ae51a1d2db"} err="failed to get container status \"001077e87a62c11f8824eb2148bf6ad305af8a0bafe4d746200720ae51a1d2db\": rpc error: code = NotFound desc = could not find container \"001077e87a62c11f8824eb2148bf6ad305af8a0bafe4d746200720ae51a1d2db\": container with ID starting with 001077e87a62c11f8824eb2148bf6ad305af8a0bafe4d746200720ae51a1d2db not found: ID does not exist" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.781636 4807 scope.go:117] "RemoveContainer" containerID="526d0bbbf0f2beccfa2ee989e3a938c3c9ce94aca54e0f5a3032d5ee616b4023" Nov 27 11:27:51 crc 
kubenswrapper[4807]: E1127 11:27:51.782910 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"526d0bbbf0f2beccfa2ee989e3a938c3c9ce94aca54e0f5a3032d5ee616b4023\": container with ID starting with 526d0bbbf0f2beccfa2ee989e3a938c3c9ce94aca54e0f5a3032d5ee616b4023 not found: ID does not exist" containerID="526d0bbbf0f2beccfa2ee989e3a938c3c9ce94aca54e0f5a3032d5ee616b4023" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.782937 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"526d0bbbf0f2beccfa2ee989e3a938c3c9ce94aca54e0f5a3032d5ee616b4023"} err="failed to get container status \"526d0bbbf0f2beccfa2ee989e3a938c3c9ce94aca54e0f5a3032d5ee616b4023\": rpc error: code = NotFound desc = could not find container \"526d0bbbf0f2beccfa2ee989e3a938c3c9ce94aca54e0f5a3032d5ee616b4023\": container with ID starting with 526d0bbbf0f2beccfa2ee989e3a938c3c9ce94aca54e0f5a3032d5ee616b4023 not found: ID does not exist" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.809997 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b84df57-c9f1-4b55-ab31-7133a1d0841f-logs\") pod \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.810052 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pd86\" (UniqueName: \"kubernetes.io/projected/9b84df57-c9f1-4b55-ab31-7133a1d0841f-kube-api-access-6pd86\") pod \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.810107 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-config-data\") pod 
\"9b84df57-c9f1-4b55-ab31-7133a1d0841f\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.810143 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b84df57-c9f1-4b55-ab31-7133a1d0841f-httpd-run\") pod \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.810162 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-combined-ca-bundle\") pod \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.810185 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.810228 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-internal-tls-certs\") pod \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.810279 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-scripts\") pod \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\" (UID: \"9b84df57-c9f1-4b55-ab31-7133a1d0841f\") " Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.810552 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9b84df57-c9f1-4b55-ab31-7133a1d0841f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9b84df57-c9f1-4b55-ab31-7133a1d0841f" (UID: "9b84df57-c9f1-4b55-ab31-7133a1d0841f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.810746 4807 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9b84df57-c9f1-4b55-ab31-7133a1d0841f-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.810955 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b84df57-c9f1-4b55-ab31-7133a1d0841f-logs" (OuterVolumeSpecName: "logs") pod "9b84df57-c9f1-4b55-ab31-7133a1d0841f" (UID: "9b84df57-c9f1-4b55-ab31-7133a1d0841f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.820329 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-scripts" (OuterVolumeSpecName: "scripts") pod "9b84df57-c9f1-4b55-ab31-7133a1d0841f" (UID: "9b84df57-c9f1-4b55-ab31-7133a1d0841f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.821072 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "9b84df57-c9f1-4b55-ab31-7133a1d0841f" (UID: "9b84df57-c9f1-4b55-ab31-7133a1d0841f"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.830401 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b84df57-c9f1-4b55-ab31-7133a1d0841f-kube-api-access-6pd86" (OuterVolumeSpecName: "kube-api-access-6pd86") pod "9b84df57-c9f1-4b55-ab31-7133a1d0841f" (UID: "9b84df57-c9f1-4b55-ab31-7133a1d0841f"). InnerVolumeSpecName "kube-api-access-6pd86". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.859239 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b84df57-c9f1-4b55-ab31-7133a1d0841f" (UID: "9b84df57-c9f1-4b55-ab31-7133a1d0841f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.871318 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-config-data" (OuterVolumeSpecName: "config-data") pod "9b84df57-c9f1-4b55-ab31-7133a1d0841f" (UID: "9b84df57-c9f1-4b55-ab31-7133a1d0841f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.893349 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9b84df57-c9f1-4b55-ab31-7133a1d0841f" (UID: "9b84df57-c9f1-4b55-ab31-7133a1d0841f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.911926 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b84df57-c9f1-4b55-ab31-7133a1d0841f-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.911951 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pd86\" (UniqueName: \"kubernetes.io/projected/9b84df57-c9f1-4b55-ab31-7133a1d0841f-kube-api-access-6pd86\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.911962 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.911970 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.912001 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.912010 4807 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.912019 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b84df57-c9f1-4b55-ab31-7133a1d0841f-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:51 crc kubenswrapper[4807]: I1127 11:27:51.933582 4807 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.013477 4807 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.075287 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.078819 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.079308 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7b6fc97755-xnlzr" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.082472 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.101132 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:27:52 crc kubenswrapper[4807]: E1127 11:27:52.101500 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b84df57-c9f1-4b55-ab31-7133a1d0841f" containerName="glance-httpd" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.101520 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b84df57-c9f1-4b55-ab31-7133a1d0841f" containerName="glance-httpd" Nov 27 11:27:52 crc kubenswrapper[4807]: E1127 11:27:52.101555 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b84df57-c9f1-4b55-ab31-7133a1d0841f" containerName="glance-log" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.101562 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b84df57-c9f1-4b55-ab31-7133a1d0841f" 
containerName="glance-log" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.101752 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b84df57-c9f1-4b55-ab31-7133a1d0841f" containerName="glance-httpd" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.101772 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b84df57-c9f1-4b55-ab31-7133a1d0841f" containerName="glance-log" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.102657 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.107622 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.107815 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.139215 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.218404 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04b42996-10c7-401c-b91b-e0ab4e100173-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.218463 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.218544 
4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b42996-10c7-401c-b91b-e0ab4e100173-config-data\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.218567 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b42996-10c7-401c-b91b-e0ab4e100173-scripts\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.218907 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b42996-10c7-401c-b91b-e0ab4e100173-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.219069 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04b42996-10c7-401c-b91b-e0ab4e100173-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.219123 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8csn\" (UniqueName: \"kubernetes.io/projected/04b42996-10c7-401c-b91b-e0ab4e100173-kube-api-access-f8csn\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc 
kubenswrapper[4807]: I1127 11:27:52.219171 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04b42996-10c7-401c-b91b-e0ab4e100173-logs\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.321407 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04b42996-10c7-401c-b91b-e0ab4e100173-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.321456 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.321513 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b42996-10c7-401c-b91b-e0ab4e100173-config-data\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.321545 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b42996-10c7-401c-b91b-e0ab4e100173-scripts\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.321595 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b42996-10c7-401c-b91b-e0ab4e100173-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.321624 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04b42996-10c7-401c-b91b-e0ab4e100173-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.321642 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8csn\" (UniqueName: \"kubernetes.io/projected/04b42996-10c7-401c-b91b-e0ab4e100173-kube-api-access-f8csn\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.321657 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04b42996-10c7-401c-b91b-e0ab4e100173-logs\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.321725 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.322079 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04b42996-10c7-401c-b91b-e0ab4e100173-logs\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.322142 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/04b42996-10c7-401c-b91b-e0ab4e100173-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.332944 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b42996-10c7-401c-b91b-e0ab4e100173-scripts\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.333282 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b42996-10c7-401c-b91b-e0ab4e100173-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.337663 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b42996-10c7-401c-b91b-e0ab4e100173-config-data\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.339549 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04b42996-10c7-401c-b91b-e0ab4e100173-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.341897 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8csn\" (UniqueName: \"kubernetes.io/projected/04b42996-10c7-401c-b91b-e0ab4e100173-kube-api-access-f8csn\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.353160 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-internal-api-0\" (UID: \"04b42996-10c7-401c-b91b-e0ab4e100173\") " pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.420081 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.578298 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-gw4hg"] Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.579671 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gw4hg" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.605674 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gw4hg"] Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.674741 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-pqzjj"] Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.676233 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pqzjj" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.687229 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-a42a-account-create-update-254qb"] Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.688952 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a42a-account-create-update-254qb" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.690820 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.704438 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pqzjj"] Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.715435 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a42a-account-create-update-254qb"] Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.729675 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n4sx\" (UniqueName: \"kubernetes.io/projected/7ba3ef20-046c-4460-8533-132da27f4e06-kube-api-access-2n4sx\") pod \"nova-api-db-create-gw4hg\" (UID: \"7ba3ef20-046c-4460-8533-132da27f4e06\") " pod="openstack/nova-api-db-create-gw4hg" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.729763 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba3ef20-046c-4460-8533-132da27f4e06-operator-scripts\") pod \"nova-api-db-create-gw4hg\" (UID: \"7ba3ef20-046c-4460-8533-132da27f4e06\") " pod="openstack/nova-api-db-create-gw4hg" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.771178 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-7t4pz"] Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.772346 4807 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7t4pz" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.794319 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7t4pz"] Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.832112 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v9vn\" (UniqueName: \"kubernetes.io/projected/67c46baf-4492-41d3-a556-38709abf8e0c-kube-api-access-9v9vn\") pod \"nova-api-a42a-account-create-update-254qb\" (UID: \"67c46baf-4492-41d3-a556-38709abf8e0c\") " pod="openstack/nova-api-a42a-account-create-update-254qb" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.832195 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c46baf-4492-41d3-a556-38709abf8e0c-operator-scripts\") pod \"nova-api-a42a-account-create-update-254qb\" (UID: \"67c46baf-4492-41d3-a556-38709abf8e0c\") " pod="openstack/nova-api-a42a-account-create-update-254qb" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.834591 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n4sx\" (UniqueName: \"kubernetes.io/projected/7ba3ef20-046c-4460-8533-132da27f4e06-kube-api-access-2n4sx\") pod \"nova-api-db-create-gw4hg\" (UID: \"7ba3ef20-046c-4460-8533-132da27f4e06\") " pod="openstack/nova-api-db-create-gw4hg" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.834664 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba3ef20-046c-4460-8533-132da27f4e06-operator-scripts\") pod \"nova-api-db-create-gw4hg\" (UID: \"7ba3ef20-046c-4460-8533-132da27f4e06\") " pod="openstack/nova-api-db-create-gw4hg" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.834755 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djnsj\" (UniqueName: \"kubernetes.io/projected/817319b3-2356-48a2-9922-79cfb7b00623-kube-api-access-djnsj\") pod \"nova-cell0-db-create-pqzjj\" (UID: \"817319b3-2356-48a2-9922-79cfb7b00623\") " pod="openstack/nova-cell0-db-create-pqzjj" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.834786 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/817319b3-2356-48a2-9922-79cfb7b00623-operator-scripts\") pod \"nova-cell0-db-create-pqzjj\" (UID: \"817319b3-2356-48a2-9922-79cfb7b00623\") " pod="openstack/nova-cell0-db-create-pqzjj" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.836041 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba3ef20-046c-4460-8533-132da27f4e06-operator-scripts\") pod \"nova-api-db-create-gw4hg\" (UID: \"7ba3ef20-046c-4460-8533-132da27f4e06\") " pod="openstack/nova-api-db-create-gw4hg" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.868015 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n4sx\" (UniqueName: \"kubernetes.io/projected/7ba3ef20-046c-4460-8533-132da27f4e06-kube-api-access-2n4sx\") pod \"nova-api-db-create-gw4hg\" (UID: \"7ba3ef20-046c-4460-8533-132da27f4e06\") " pod="openstack/nova-api-db-create-gw4hg" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.877526 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7aae-account-create-update-5lvsf"] Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.878697 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7aae-account-create-update-5lvsf" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.886582 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.908290 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7aae-account-create-update-5lvsf"] Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.914157 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gw4hg" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.936218 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djnsj\" (UniqueName: \"kubernetes.io/projected/817319b3-2356-48a2-9922-79cfb7b00623-kube-api-access-djnsj\") pod \"nova-cell0-db-create-pqzjj\" (UID: \"817319b3-2356-48a2-9922-79cfb7b00623\") " pod="openstack/nova-cell0-db-create-pqzjj" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.936285 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/817319b3-2356-48a2-9922-79cfb7b00623-operator-scripts\") pod \"nova-cell0-db-create-pqzjj\" (UID: \"817319b3-2356-48a2-9922-79cfb7b00623\") " pod="openstack/nova-cell0-db-create-pqzjj" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.936313 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v9vn\" (UniqueName: \"kubernetes.io/projected/67c46baf-4492-41d3-a556-38709abf8e0c-kube-api-access-9v9vn\") pod \"nova-api-a42a-account-create-update-254qb\" (UID: \"67c46baf-4492-41d3-a556-38709abf8e0c\") " pod="openstack/nova-api-a42a-account-create-update-254qb" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.936343 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dmtq8\" (UniqueName: \"kubernetes.io/projected/25f3c060-bf19-4af4-be26-521942d50da4-kube-api-access-dmtq8\") pod \"nova-cell1-db-create-7t4pz\" (UID: \"25f3c060-bf19-4af4-be26-521942d50da4\") " pod="openstack/nova-cell1-db-create-7t4pz" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.936370 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25f3c060-bf19-4af4-be26-521942d50da4-operator-scripts\") pod \"nova-cell1-db-create-7t4pz\" (UID: \"25f3c060-bf19-4af4-be26-521942d50da4\") " pod="openstack/nova-cell1-db-create-7t4pz" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.936413 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c46baf-4492-41d3-a556-38709abf8e0c-operator-scripts\") pod \"nova-api-a42a-account-create-update-254qb\" (UID: \"67c46baf-4492-41d3-a556-38709abf8e0c\") " pod="openstack/nova-api-a42a-account-create-update-254qb" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.937154 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c46baf-4492-41d3-a556-38709abf8e0c-operator-scripts\") pod \"nova-api-a42a-account-create-update-254qb\" (UID: \"67c46baf-4492-41d3-a556-38709abf8e0c\") " pod="openstack/nova-api-a42a-account-create-update-254qb" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.937537 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/817319b3-2356-48a2-9922-79cfb7b00623-operator-scripts\") pod \"nova-cell0-db-create-pqzjj\" (UID: \"817319b3-2356-48a2-9922-79cfb7b00623\") " pod="openstack/nova-cell0-db-create-pqzjj" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.959018 4807 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-9v9vn\" (UniqueName: \"kubernetes.io/projected/67c46baf-4492-41d3-a556-38709abf8e0c-kube-api-access-9v9vn\") pod \"nova-api-a42a-account-create-update-254qb\" (UID: \"67c46baf-4492-41d3-a556-38709abf8e0c\") " pod="openstack/nova-api-a42a-account-create-update-254qb" Nov 27 11:27:52 crc kubenswrapper[4807]: I1127 11:27:52.961849 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djnsj\" (UniqueName: \"kubernetes.io/projected/817319b3-2356-48a2-9922-79cfb7b00623-kube-api-access-djnsj\") pod \"nova-cell0-db-create-pqzjj\" (UID: \"817319b3-2356-48a2-9922-79cfb7b00623\") " pod="openstack/nova-cell0-db-create-pqzjj" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.005714 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pqzjj" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.015717 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a42a-account-create-update-254qb" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.039216 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c6cc86c-1844-4444-85a9-370f5bc090d2-operator-scripts\") pod \"nova-cell0-7aae-account-create-update-5lvsf\" (UID: \"8c6cc86c-1844-4444-85a9-370f5bc090d2\") " pod="openstack/nova-cell0-7aae-account-create-update-5lvsf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.039529 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmtq8\" (UniqueName: \"kubernetes.io/projected/25f3c060-bf19-4af4-be26-521942d50da4-kube-api-access-dmtq8\") pod \"nova-cell1-db-create-7t4pz\" (UID: \"25f3c060-bf19-4af4-be26-521942d50da4\") " pod="openstack/nova-cell1-db-create-7t4pz" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.039618 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25f3c060-bf19-4af4-be26-521942d50da4-operator-scripts\") pod \"nova-cell1-db-create-7t4pz\" (UID: \"25f3c060-bf19-4af4-be26-521942d50da4\") " pod="openstack/nova-cell1-db-create-7t4pz" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.039825 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksjzb\" (UniqueName: \"kubernetes.io/projected/8c6cc86c-1844-4444-85a9-370f5bc090d2-kube-api-access-ksjzb\") pod \"nova-cell0-7aae-account-create-update-5lvsf\" (UID: \"8c6cc86c-1844-4444-85a9-370f5bc090d2\") " pod="openstack/nova-cell0-7aae-account-create-update-5lvsf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.041138 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25f3c060-bf19-4af4-be26-521942d50da4-operator-scripts\") pod \"nova-cell1-db-create-7t4pz\" (UID: \"25f3c060-bf19-4af4-be26-521942d50da4\") " pod="openstack/nova-cell1-db-create-7t4pz" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.067962 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmtq8\" (UniqueName: \"kubernetes.io/projected/25f3c060-bf19-4af4-be26-521942d50da4-kube-api-access-dmtq8\") pod \"nova-cell1-db-create-7t4pz\" (UID: \"25f3c060-bf19-4af4-be26-521942d50da4\") " pod="openstack/nova-cell1-db-create-7t4pz" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.098984 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7t4pz" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.107535 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6113-account-create-update-xmtrf"] Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.108749 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6113-account-create-update-xmtrf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.114596 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.115298 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6113-account-create-update-xmtrf"] Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.141593 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksjzb\" (UniqueName: \"kubernetes.io/projected/8c6cc86c-1844-4444-85a9-370f5bc090d2-kube-api-access-ksjzb\") pod \"nova-cell0-7aae-account-create-update-5lvsf\" (UID: \"8c6cc86c-1844-4444-85a9-370f5bc090d2\") " pod="openstack/nova-cell0-7aae-account-create-update-5lvsf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.141696 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c6cc86c-1844-4444-85a9-370f5bc090d2-operator-scripts\") pod \"nova-cell0-7aae-account-create-update-5lvsf\" (UID: \"8c6cc86c-1844-4444-85a9-370f5bc090d2\") " pod="openstack/nova-cell0-7aae-account-create-update-5lvsf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.143046 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c6cc86c-1844-4444-85a9-370f5bc090d2-operator-scripts\") pod \"nova-cell0-7aae-account-create-update-5lvsf\" (UID: \"8c6cc86c-1844-4444-85a9-370f5bc090d2\") " 
pod="openstack/nova-cell0-7aae-account-create-update-5lvsf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.154197 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7b8bd6c76d-jg9hp" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.154317 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.163899 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksjzb\" (UniqueName: \"kubernetes.io/projected/8c6cc86c-1844-4444-85a9-370f5bc090d2-kube-api-access-ksjzb\") pod \"nova-cell0-7aae-account-create-update-5lvsf\" (UID: \"8c6cc86c-1844-4444-85a9-370f5bc090d2\") " pod="openstack/nova-cell0-7aae-account-create-update-5lvsf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.189305 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.203671 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7aae-account-create-update-5lvsf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.247914 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf66f\" (UniqueName: \"kubernetes.io/projected/efa0aaff-659e-4816-9cba-298325fbc28c-kube-api-access-xf66f\") pod \"nova-cell1-6113-account-create-update-xmtrf\" (UID: \"efa0aaff-659e-4816-9cba-298325fbc28c\") " pod="openstack/nova-cell1-6113-account-create-update-xmtrf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.247997 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efa0aaff-659e-4816-9cba-298325fbc28c-operator-scripts\") pod \"nova-cell1-6113-account-create-update-xmtrf\" (UID: \"efa0aaff-659e-4816-9cba-298325fbc28c\") " pod="openstack/nova-cell1-6113-account-create-update-xmtrf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.349394 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf66f\" (UniqueName: \"kubernetes.io/projected/efa0aaff-659e-4816-9cba-298325fbc28c-kube-api-access-xf66f\") pod \"nova-cell1-6113-account-create-update-xmtrf\" (UID: \"efa0aaff-659e-4816-9cba-298325fbc28c\") " pod="openstack/nova-cell1-6113-account-create-update-xmtrf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.349575 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efa0aaff-659e-4816-9cba-298325fbc28c-operator-scripts\") pod \"nova-cell1-6113-account-create-update-xmtrf\" (UID: \"efa0aaff-659e-4816-9cba-298325fbc28c\") " pod="openstack/nova-cell1-6113-account-create-update-xmtrf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.350351 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/efa0aaff-659e-4816-9cba-298325fbc28c-operator-scripts\") pod \"nova-cell1-6113-account-create-update-xmtrf\" (UID: \"efa0aaff-659e-4816-9cba-298325fbc28c\") " pod="openstack/nova-cell1-6113-account-create-update-xmtrf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.369314 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf66f\" (UniqueName: \"kubernetes.io/projected/efa0aaff-659e-4816-9cba-298325fbc28c-kube-api-access-xf66f\") pod \"nova-cell1-6113-account-create-update-xmtrf\" (UID: \"efa0aaff-659e-4816-9cba-298325fbc28c\") " pod="openstack/nova-cell1-6113-account-create-update-xmtrf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.431757 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-gw4hg"] Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.554967 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b84df57-c9f1-4b55-ab31-7133a1d0841f" path="/var/lib/kubelet/pods/9b84df57-c9f1-4b55-ab31-7133a1d0841f/volumes" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.564504 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6113-account-create-update-xmtrf" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.632793 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-a42a-account-create-update-254qb"] Nov 27 11:27:53 crc kubenswrapper[4807]: W1127 11:27:53.650984 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67c46baf_4492_41d3_a556_38709abf8e0c.slice/crio-e8460a5b2a8331c3aabac682417b6d506194742c4bd2c138599ee8d074b15d70 WatchSource:0}: Error finding container e8460a5b2a8331c3aabac682417b6d506194742c4bd2c138599ee8d074b15d70: Status 404 returned error can't find the container with id e8460a5b2a8331c3aabac682417b6d506194742c4bd2c138599ee8d074b15d70 Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.772206 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-7t4pz"] Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.813900 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7aae-account-create-update-5lvsf"] Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.843578 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pqzjj"] Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.861108 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a42a-account-create-update-254qb" event={"ID":"67c46baf-4492-41d3-a556-38709abf8e0c","Type":"ContainerStarted","Data":"e8460a5b2a8331c3aabac682417b6d506194742c4bd2c138599ee8d074b15d70"} Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.864379 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b467c77b4-xkthn_7d8965e4-cee0-4551-8bd1-f8322e804eef/neutron-api/0.log" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.864419 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="7d8965e4-cee0-4551-8bd1-f8322e804eef" containerID="ecad29e8a69528d2c562482c4041bf612ef0b19a9c64d780177334d565f7433c" exitCode=137 Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.864464 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b467c77b4-xkthn" event={"ID":"7d8965e4-cee0-4551-8bd1-f8322e804eef","Type":"ContainerDied","Data":"ecad29e8a69528d2c562482c4041bf612ef0b19a9c64d780177334d565f7433c"} Nov 27 11:27:53 crc kubenswrapper[4807]: W1127 11:27:53.870923 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c6cc86c_1844_4444_85a9_370f5bc090d2.slice/crio-ca20a536be01f0ec21da44007a6f3bfcd74eb8d5812d861eacf0b77722e838fc WatchSource:0}: Error finding container ca20a536be01f0ec21da44007a6f3bfcd74eb8d5812d861eacf0b77722e838fc: Status 404 returned error can't find the container with id ca20a536be01f0ec21da44007a6f3bfcd74eb8d5812d861eacf0b77722e838fc Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.871240 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gw4hg" event={"ID":"7ba3ef20-046c-4460-8533-132da27f4e06","Type":"ContainerStarted","Data":"c0a0bf8dc8342630fc726a7a6966cac36479dcac76aeff10b6f25051ef5af7ae"} Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.871311 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gw4hg" event={"ID":"7ba3ef20-046c-4460-8533-132da27f4e06","Type":"ContainerStarted","Data":"b78953550edac2c28dd48af4fb8c128677793686cd8e85237dc74f29a55f57be"} Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.873970 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04b42996-10c7-401c-b91b-e0ab4e100173","Type":"ContainerStarted","Data":"2ef9829975031cb5f140ba5fb53712dad3176e339f1d4c8804c58adf2a628625"} Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.881394 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f4c8710-cab5-4bf1-8bdd-86e6350e8058","Type":"ContainerStarted","Data":"75570ec250347d91caef7e277ed2a6a1f28d0263bc5cb9cbd917ebfce447dc0c"} Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.881551 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="ceilometer-central-agent" containerID="cri-o://5671143180d9375e881250f0827afefd8468a1a07d89b4cd65cb23c88860d288" gracePeriod=30 Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.881698 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.881746 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="proxy-httpd" containerID="cri-o://75570ec250347d91caef7e277ed2a6a1f28d0263bc5cb9cbd917ebfce447dc0c" gracePeriod=30 Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.881786 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="sg-core" containerID="cri-o://149cf5085c74a35f044da2d633419d3b6d777c039955da32466803b43527f85c" gracePeriod=30 Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.881818 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="ceilometer-notification-agent" containerID="cri-o://8023000156d97e1e14ea3cfdecc4769d87ac45c20305e073611f2453d4283d30" gracePeriod=30 Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.898070 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-gw4hg" podStartSLOduration=1.89805124 
podStartE2EDuration="1.89805124s" podCreationTimestamp="2025-11-27 11:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:27:53.888241293 +0000 UTC m=+1114.987739491" watchObservedRunningTime="2025-11-27 11:27:53.89805124 +0000 UTC m=+1114.997549438" Nov 27 11:27:53 crc kubenswrapper[4807]: I1127 11:27:53.955291 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.052164432 podStartE2EDuration="6.955270346s" podCreationTimestamp="2025-11-27 11:27:47 +0000 UTC" firstStartedPulling="2025-11-27 11:27:48.57283653 +0000 UTC m=+1109.672334728" lastFinishedPulling="2025-11-27 11:27:53.475942444 +0000 UTC m=+1114.575440642" observedRunningTime="2025-11-27 11:27:53.93174322 +0000 UTC m=+1115.031241408" watchObservedRunningTime="2025-11-27 11:27:53.955270346 +0000 UTC m=+1115.054768544" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.199528 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6113-account-create-update-xmtrf"] Nov 27 11:27:54 crc kubenswrapper[4807]: W1127 11:27:54.251630 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefa0aaff_659e_4816_9cba_298325fbc28c.slice/crio-63a8543c3cf83ac6277185c3de4c9c7b47b04a967bea1e4faafccdbd688a5e6b WatchSource:0}: Error finding container 63a8543c3cf83ac6277185c3de4c9c7b47b04a967bea1e4faafccdbd688a5e6b: Status 404 returned error can't find the container with id 63a8543c3cf83ac6277185c3de4c9c7b47b04a967bea1e4faafccdbd688a5e6b Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.253660 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b467c77b4-xkthn_7d8965e4-cee0-4551-8bd1-f8322e804eef/neutron-api/0.log" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.253728 4807 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.370280 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcfjz\" (UniqueName: \"kubernetes.io/projected/7d8965e4-cee0-4551-8bd1-f8322e804eef-kube-api-access-wcfjz\") pod \"7d8965e4-cee0-4551-8bd1-f8322e804eef\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.370339 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-httpd-config\") pod \"7d8965e4-cee0-4551-8bd1-f8322e804eef\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.370428 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-combined-ca-bundle\") pod \"7d8965e4-cee0-4551-8bd1-f8322e804eef\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.370625 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-ovndb-tls-certs\") pod \"7d8965e4-cee0-4551-8bd1-f8322e804eef\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.370655 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-config\") pod \"7d8965e4-cee0-4551-8bd1-f8322e804eef\" (UID: \"7d8965e4-cee0-4551-8bd1-f8322e804eef\") " Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.376491 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7d8965e4-cee0-4551-8bd1-f8322e804eef" (UID: "7d8965e4-cee0-4551-8bd1-f8322e804eef"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.377417 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d8965e4-cee0-4551-8bd1-f8322e804eef-kube-api-access-wcfjz" (OuterVolumeSpecName: "kube-api-access-wcfjz") pod "7d8965e4-cee0-4551-8bd1-f8322e804eef" (UID: "7d8965e4-cee0-4551-8bd1-f8322e804eef"). InnerVolumeSpecName "kube-api-access-wcfjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.418774 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d8965e4-cee0-4551-8bd1-f8322e804eef" (UID: "7d8965e4-cee0-4551-8bd1-f8322e804eef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.419919 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-config" (OuterVolumeSpecName: "config") pod "7d8965e4-cee0-4551-8bd1-f8322e804eef" (UID: "7d8965e4-cee0-4551-8bd1-f8322e804eef"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.473240 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.473291 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcfjz\" (UniqueName: \"kubernetes.io/projected/7d8965e4-cee0-4551-8bd1-f8322e804eef-kube-api-access-wcfjz\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.473310 4807 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.473324 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.504828 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7d8965e4-cee0-4551-8bd1-f8322e804eef" (UID: "7d8965e4-cee0-4551-8bd1-f8322e804eef"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.575159 4807 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d8965e4-cee0-4551-8bd1-f8322e804eef-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.896702 4807 generic.go:334] "Generic (PLEG): container finished" podID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerID="149cf5085c74a35f044da2d633419d3b6d777c039955da32466803b43527f85c" exitCode=2 Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.896978 4807 generic.go:334] "Generic (PLEG): container finished" podID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerID="8023000156d97e1e14ea3cfdecc4769d87ac45c20305e073611f2453d4283d30" exitCode=0 Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.896771 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f4c8710-cab5-4bf1-8bdd-86e6350e8058","Type":"ContainerDied","Data":"149cf5085c74a35f044da2d633419d3b6d777c039955da32466803b43527f85c"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.897069 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f4c8710-cab5-4bf1-8bdd-86e6350e8058","Type":"ContainerDied","Data":"8023000156d97e1e14ea3cfdecc4769d87ac45c20305e073611f2453d4283d30"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.901022 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6113-account-create-update-xmtrf" event={"ID":"efa0aaff-659e-4816-9cba-298325fbc28c","Type":"ContainerStarted","Data":"816e58a5835552db4016c0195edacf188db0a1379757b14a48c358cb20ac493b"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.901101 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6113-account-create-update-xmtrf" 
event={"ID":"efa0aaff-659e-4816-9cba-298325fbc28c","Type":"ContainerStarted","Data":"63a8543c3cf83ac6277185c3de4c9c7b47b04a967bea1e4faafccdbd688a5e6b"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.903953 4807 generic.go:334] "Generic (PLEG): container finished" podID="25f3c060-bf19-4af4-be26-521942d50da4" containerID="12409865f31fae447430df480013e4668bed3163998da7f70a270ef1e1b2799a" exitCode=0 Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.904028 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7t4pz" event={"ID":"25f3c060-bf19-4af4-be26-521942d50da4","Type":"ContainerDied","Data":"12409865f31fae447430df480013e4668bed3163998da7f70a270ef1e1b2799a"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.904060 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7t4pz" event={"ID":"25f3c060-bf19-4af4-be26-521942d50da4","Type":"ContainerStarted","Data":"bb5afa7c2f0682a89e877f64f9389b013676cbbabf6126b4b52fd589b5e0c4a1"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.906462 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04b42996-10c7-401c-b91b-e0ab4e100173","Type":"ContainerStarted","Data":"5b9a461c1be19461241b4e3be6bd039d58ff5dff1d09e5c8a6606430abbb31d7"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.909873 4807 generic.go:334] "Generic (PLEG): container finished" podID="7ba3ef20-046c-4460-8533-132da27f4e06" containerID="c0a0bf8dc8342630fc726a7a6966cac36479dcac76aeff10b6f25051ef5af7ae" exitCode=0 Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.909900 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gw4hg" event={"ID":"7ba3ef20-046c-4460-8533-132da27f4e06","Type":"ContainerDied","Data":"c0a0bf8dc8342630fc726a7a6966cac36479dcac76aeff10b6f25051ef5af7ae"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.911999 4807 generic.go:334] 
"Generic (PLEG): container finished" podID="817319b3-2356-48a2-9922-79cfb7b00623" containerID="5e50bcfadd380103be68eff6a81c37b3c4e3a0a6f66a1f7b471470bfaaba0b04" exitCode=0 Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.912041 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pqzjj" event={"ID":"817319b3-2356-48a2-9922-79cfb7b00623","Type":"ContainerDied","Data":"5e50bcfadd380103be68eff6a81c37b3c4e3a0a6f66a1f7b471470bfaaba0b04"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.912074 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pqzjj" event={"ID":"817319b3-2356-48a2-9922-79cfb7b00623","Type":"ContainerStarted","Data":"695f9e1a39e31ce0634864290b64ce18fadde8b633c25e3af0deff4ae756d4bb"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.914831 4807 generic.go:334] "Generic (PLEG): container finished" podID="67c46baf-4492-41d3-a556-38709abf8e0c" containerID="c44ef97808e4e5abd0e62650df1fe0e599294a67bd96897898931542cf97e03d" exitCode=0 Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.914878 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a42a-account-create-update-254qb" event={"ID":"67c46baf-4492-41d3-a556-38709abf8e0c","Type":"ContainerDied","Data":"c44ef97808e4e5abd0e62650df1fe0e599294a67bd96897898931542cf97e03d"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.926824 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-b467c77b4-xkthn_7d8965e4-cee0-4551-8bd1-f8322e804eef/neutron-api/0.log" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.926952 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b467c77b4-xkthn" event={"ID":"7d8965e4-cee0-4551-8bd1-f8322e804eef","Type":"ContainerDied","Data":"0232d38e27b74d1502e3e485b365c500c1732859da22264dab00cf51d08ceafb"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.926991 4807 scope.go:117] "RemoveContainer" 
containerID="bc38cb6be7fa696e810dbf7a6fb2e74170135c020b51380d574297c7b79ab57d" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.927177 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b467c77b4-xkthn" Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.934826 4807 generic.go:334] "Generic (PLEG): container finished" podID="8c6cc86c-1844-4444-85a9-370f5bc090d2" containerID="9c6ea88d3a58a15196122eb8ac4f99eacb5fb9228351bc90ed64c5134a32a727" exitCode=0 Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.934879 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7aae-account-create-update-5lvsf" event={"ID":"8c6cc86c-1844-4444-85a9-370f5bc090d2","Type":"ContainerDied","Data":"9c6ea88d3a58a15196122eb8ac4f99eacb5fb9228351bc90ed64c5134a32a727"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.934908 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7aae-account-create-update-5lvsf" event={"ID":"8c6cc86c-1844-4444-85a9-370f5bc090d2","Type":"ContainerStarted","Data":"ca20a536be01f0ec21da44007a6f3bfcd74eb8d5812d861eacf0b77722e838fc"} Nov 27 11:27:54 crc kubenswrapper[4807]: I1127 11:27:54.978323 4807 scope.go:117] "RemoveContainer" containerID="ecad29e8a69528d2c562482c4041bf612ef0b19a9c64d780177334d565f7433c" Nov 27 11:27:55 crc kubenswrapper[4807]: I1127 11:27:55.007730 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b467c77b4-xkthn"] Nov 27 11:27:55 crc kubenswrapper[4807]: I1127 11:27:55.015833 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b467c77b4-xkthn"] Nov 27 11:27:55 crc kubenswrapper[4807]: I1127 11:27:55.546592 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d8965e4-cee0-4551-8bd1-f8322e804eef" path="/var/lib/kubelet/pods/7d8965e4-cee0-4551-8bd1-f8322e804eef/volumes" Nov 27 11:27:55 crc kubenswrapper[4807]: I1127 11:27:55.945201 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"04b42996-10c7-401c-b91b-e0ab4e100173","Type":"ContainerStarted","Data":"ee2c003991d9dd8f4355fe42aa7cf63ce3a3bf6c9786c73aa9d639f9d2e25174"} Nov 27 11:27:55 crc kubenswrapper[4807]: I1127 11:27:55.946684 4807 generic.go:334] "Generic (PLEG): container finished" podID="efa0aaff-659e-4816-9cba-298325fbc28c" containerID="816e58a5835552db4016c0195edacf188db0a1379757b14a48c358cb20ac493b" exitCode=0 Nov 27 11:27:55 crc kubenswrapper[4807]: I1127 11:27:55.946753 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6113-account-create-update-xmtrf" event={"ID":"efa0aaff-659e-4816-9cba-298325fbc28c","Type":"ContainerDied","Data":"816e58a5835552db4016c0195edacf188db0a1379757b14a48c358cb20ac493b"} Nov 27 11:27:55 crc kubenswrapper[4807]: I1127 11:27:55.975217 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.975198363 podStartE2EDuration="3.975198363s" podCreationTimestamp="2025-11-27 11:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:27:55.970137351 +0000 UTC m=+1117.069635539" watchObservedRunningTime="2025-11-27 11:27:55.975198363 +0000 UTC m=+1117.074696561" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.368811 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pqzjj" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.519200 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/817319b3-2356-48a2-9922-79cfb7b00623-operator-scripts\") pod \"817319b3-2356-48a2-9922-79cfb7b00623\" (UID: \"817319b3-2356-48a2-9922-79cfb7b00623\") " Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.519465 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djnsj\" (UniqueName: \"kubernetes.io/projected/817319b3-2356-48a2-9922-79cfb7b00623-kube-api-access-djnsj\") pod \"817319b3-2356-48a2-9922-79cfb7b00623\" (UID: \"817319b3-2356-48a2-9922-79cfb7b00623\") " Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.521888 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/817319b3-2356-48a2-9922-79cfb7b00623-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "817319b3-2356-48a2-9922-79cfb7b00623" (UID: "817319b3-2356-48a2-9922-79cfb7b00623"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.533404 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/817319b3-2356-48a2-9922-79cfb7b00623-kube-api-access-djnsj" (OuterVolumeSpecName: "kube-api-access-djnsj") pod "817319b3-2356-48a2-9922-79cfb7b00623" (UID: "817319b3-2356-48a2-9922-79cfb7b00623"). InnerVolumeSpecName "kube-api-access-djnsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.621964 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/817319b3-2356-48a2-9922-79cfb7b00623-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.621995 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djnsj\" (UniqueName: \"kubernetes.io/projected/817319b3-2356-48a2-9922-79cfb7b00623-kube-api-access-djnsj\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.622420 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7aae-account-create-update-5lvsf" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.669611 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6113-account-create-update-xmtrf" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.675880 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-gw4hg" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.705495 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a42a-account-create-update-254qb" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.709646 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-7t4pz" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.723227 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba3ef20-046c-4460-8533-132da27f4e06-operator-scripts\") pod \"7ba3ef20-046c-4460-8533-132da27f4e06\" (UID: \"7ba3ef20-046c-4460-8533-132da27f4e06\") " Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.723282 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n4sx\" (UniqueName: \"kubernetes.io/projected/7ba3ef20-046c-4460-8533-132da27f4e06-kube-api-access-2n4sx\") pod \"7ba3ef20-046c-4460-8533-132da27f4e06\" (UID: \"7ba3ef20-046c-4460-8533-132da27f4e06\") " Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.723321 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efa0aaff-659e-4816-9cba-298325fbc28c-operator-scripts\") pod \"efa0aaff-659e-4816-9cba-298325fbc28c\" (UID: \"efa0aaff-659e-4816-9cba-298325fbc28c\") " Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.723375 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf66f\" (UniqueName: \"kubernetes.io/projected/efa0aaff-659e-4816-9cba-298325fbc28c-kube-api-access-xf66f\") pod \"efa0aaff-659e-4816-9cba-298325fbc28c\" (UID: \"efa0aaff-659e-4816-9cba-298325fbc28c\") " Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.723404 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c6cc86c-1844-4444-85a9-370f5bc090d2-operator-scripts\") pod \"8c6cc86c-1844-4444-85a9-370f5bc090d2\" (UID: \"8c6cc86c-1844-4444-85a9-370f5bc090d2\") " Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.723455 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-ksjzb\" (UniqueName: \"kubernetes.io/projected/8c6cc86c-1844-4444-85a9-370f5bc090d2-kube-api-access-ksjzb\") pod \"8c6cc86c-1844-4444-85a9-370f5bc090d2\" (UID: \"8c6cc86c-1844-4444-85a9-370f5bc090d2\") " Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.725033 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efa0aaff-659e-4816-9cba-298325fbc28c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efa0aaff-659e-4816-9cba-298325fbc28c" (UID: "efa0aaff-659e-4816-9cba-298325fbc28c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.725392 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ba3ef20-046c-4460-8533-132da27f4e06-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ba3ef20-046c-4460-8533-132da27f4e06" (UID: "7ba3ef20-046c-4460-8533-132da27f4e06"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.727926 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6cc86c-1844-4444-85a9-370f5bc090d2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c6cc86c-1844-4444-85a9-370f5bc090d2" (UID: "8c6cc86c-1844-4444-85a9-370f5bc090d2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.728042 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efa0aaff-659e-4816-9cba-298325fbc28c-kube-api-access-xf66f" (OuterVolumeSpecName: "kube-api-access-xf66f") pod "efa0aaff-659e-4816-9cba-298325fbc28c" (UID: "efa0aaff-659e-4816-9cba-298325fbc28c"). 
InnerVolumeSpecName "kube-api-access-xf66f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.728794 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6cc86c-1844-4444-85a9-370f5bc090d2-kube-api-access-ksjzb" (OuterVolumeSpecName: "kube-api-access-ksjzb") pod "8c6cc86c-1844-4444-85a9-370f5bc090d2" (UID: "8c6cc86c-1844-4444-85a9-370f5bc090d2"). InnerVolumeSpecName "kube-api-access-ksjzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.738602 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ba3ef20-046c-4460-8533-132da27f4e06-kube-api-access-2n4sx" (OuterVolumeSpecName: "kube-api-access-2n4sx") pod "7ba3ef20-046c-4460-8533-132da27f4e06" (UID: "7ba3ef20-046c-4460-8533-132da27f4e06"). InnerVolumeSpecName "kube-api-access-2n4sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.825586 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25f3c060-bf19-4af4-be26-521942d50da4-operator-scripts\") pod \"25f3c060-bf19-4af4-be26-521942d50da4\" (UID: \"25f3c060-bf19-4af4-be26-521942d50da4\") " Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.826009 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v9vn\" (UniqueName: \"kubernetes.io/projected/67c46baf-4492-41d3-a556-38709abf8e0c-kube-api-access-9v9vn\") pod \"67c46baf-4492-41d3-a556-38709abf8e0c\" (UID: \"67c46baf-4492-41d3-a556-38709abf8e0c\") " Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.826042 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmtq8\" (UniqueName: 
\"kubernetes.io/projected/25f3c060-bf19-4af4-be26-521942d50da4-kube-api-access-dmtq8\") pod \"25f3c060-bf19-4af4-be26-521942d50da4\" (UID: \"25f3c060-bf19-4af4-be26-521942d50da4\") " Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.826065 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c46baf-4492-41d3-a556-38709abf8e0c-operator-scripts\") pod \"67c46baf-4492-41d3-a556-38709abf8e0c\" (UID: \"67c46baf-4492-41d3-a556-38709abf8e0c\") " Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.826460 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksjzb\" (UniqueName: \"kubernetes.io/projected/8c6cc86c-1844-4444-85a9-370f5bc090d2-kube-api-access-ksjzb\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.826477 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ba3ef20-046c-4460-8533-132da27f4e06-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.826488 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n4sx\" (UniqueName: \"kubernetes.io/projected/7ba3ef20-046c-4460-8533-132da27f4e06-kube-api-access-2n4sx\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.826496 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efa0aaff-659e-4816-9cba-298325fbc28c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.826506 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf66f\" (UniqueName: \"kubernetes.io/projected/efa0aaff-659e-4816-9cba-298325fbc28c-kube-api-access-xf66f\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.826515 4807 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c6cc86c-1844-4444-85a9-370f5bc090d2-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.826754 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67c46baf-4492-41d3-a556-38709abf8e0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67c46baf-4492-41d3-a556-38709abf8e0c" (UID: "67c46baf-4492-41d3-a556-38709abf8e0c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.827088 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25f3c060-bf19-4af4-be26-521942d50da4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25f3c060-bf19-4af4-be26-521942d50da4" (UID: "25f3c060-bf19-4af4-be26-521942d50da4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.829295 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f3c060-bf19-4af4-be26-521942d50da4-kube-api-access-dmtq8" (OuterVolumeSpecName: "kube-api-access-dmtq8") pod "25f3c060-bf19-4af4-be26-521942d50da4" (UID: "25f3c060-bf19-4af4-be26-521942d50da4"). InnerVolumeSpecName "kube-api-access-dmtq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.830356 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67c46baf-4492-41d3-a556-38709abf8e0c-kube-api-access-9v9vn" (OuterVolumeSpecName: "kube-api-access-9v9vn") pod "67c46baf-4492-41d3-a556-38709abf8e0c" (UID: "67c46baf-4492-41d3-a556-38709abf8e0c"). InnerVolumeSpecName "kube-api-access-9v9vn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.928664 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25f3c060-bf19-4af4-be26-521942d50da4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.928695 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v9vn\" (UniqueName: \"kubernetes.io/projected/67c46baf-4492-41d3-a556-38709abf8e0c-kube-api-access-9v9vn\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.928704 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmtq8\" (UniqueName: \"kubernetes.io/projected/25f3c060-bf19-4af4-be26-521942d50da4-kube-api-access-dmtq8\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.928712 4807 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67c46baf-4492-41d3-a556-38709abf8e0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.974770 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6113-account-create-update-xmtrf" event={"ID":"efa0aaff-659e-4816-9cba-298325fbc28c","Type":"ContainerDied","Data":"63a8543c3cf83ac6277185c3de4c9c7b47b04a967bea1e4faafccdbd688a5e6b"} Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.974822 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63a8543c3cf83ac6277185c3de4c9c7b47b04a967bea1e4faafccdbd688a5e6b" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.974927 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6113-account-create-update-xmtrf" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.987559 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pqzjj" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.987574 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pqzjj" event={"ID":"817319b3-2356-48a2-9922-79cfb7b00623","Type":"ContainerDied","Data":"695f9e1a39e31ce0634864290b64ce18fadde8b633c25e3af0deff4ae756d4bb"} Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.988132 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="695f9e1a39e31ce0634864290b64ce18fadde8b633c25e3af0deff4ae756d4bb" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.990012 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-7t4pz" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.989977 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-7t4pz" event={"ID":"25f3c060-bf19-4af4-be26-521942d50da4","Type":"ContainerDied","Data":"bb5afa7c2f0682a89e877f64f9389b013676cbbabf6126b4b52fd589b5e0c4a1"} Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.990325 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb5afa7c2f0682a89e877f64f9389b013676cbbabf6126b4b52fd589b5e0c4a1" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.992195 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-a42a-account-create-update-254qb" event={"ID":"67c46baf-4492-41d3-a556-38709abf8e0c","Type":"ContainerDied","Data":"e8460a5b2a8331c3aabac682417b6d506194742c4bd2c138599ee8d074b15d70"} Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.992234 4807 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="e8460a5b2a8331c3aabac682417b6d506194742c4bd2c138599ee8d074b15d70" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.992422 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-a42a-account-create-update-254qb" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.995737 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7aae-account-create-update-5lvsf" event={"ID":"8c6cc86c-1844-4444-85a9-370f5bc090d2","Type":"ContainerDied","Data":"ca20a536be01f0ec21da44007a6f3bfcd74eb8d5812d861eacf0b77722e838fc"} Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.995784 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca20a536be01f0ec21da44007a6f3bfcd74eb8d5812d861eacf0b77722e838fc" Nov 27 11:27:56 crc kubenswrapper[4807]: I1127 11:27:56.995854 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7aae-account-create-update-5lvsf" Nov 27 11:27:57 crc kubenswrapper[4807]: I1127 11:27:57.009989 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-gw4hg" Nov 27 11:27:57 crc kubenswrapper[4807]: I1127 11:27:57.010852 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-gw4hg" event={"ID":"7ba3ef20-046c-4460-8533-132da27f4e06","Type":"ContainerDied","Data":"b78953550edac2c28dd48af4fb8c128677793686cd8e85237dc74f29a55f57be"} Nov 27 11:27:57 crc kubenswrapper[4807]: I1127 11:27:57.010884 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b78953550edac2c28dd48af4fb8c128677793686cd8e85237dc74f29a55f57be" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.023335 4807 generic.go:334] "Generic (PLEG): container finished" podID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerID="0431edfd02689c766bfba84526203198805f6a64725b59ad7a98d79e2b6095d8" exitCode=137 Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.023410 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8bd6c76d-jg9hp" event={"ID":"972db85e-5d7f-4312-b2c1-36f3c4e697d3","Type":"ContainerDied","Data":"0431edfd02689c766bfba84526203198805f6a64725b59ad7a98d79e2b6095d8"} Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.083589 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.084054 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="da707f4d-a2cd-426f-b524-874435ef409c" containerName="glance-httpd" containerID="cri-o://edd43dc250c0190e440bb59c0b21000ef935603af1c0356ed90b85b963aa250e" gracePeriod=30 Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.083873 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="da707f4d-a2cd-426f-b524-874435ef409c" containerName="glance-log" 
containerID="cri-o://ad157d88570ad9ad10dd3f21af3b4681272f81ca4c53bccd1cecec85ce333532" gracePeriod=30 Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.117707 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cwjzz"] Nov 27 11:27:58 crc kubenswrapper[4807]: E1127 11:27:58.118063 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ba3ef20-046c-4460-8533-132da27f4e06" containerName="mariadb-database-create" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118076 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ba3ef20-046c-4460-8533-132da27f4e06" containerName="mariadb-database-create" Nov 27 11:27:58 crc kubenswrapper[4807]: E1127 11:27:58.118100 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67c46baf-4492-41d3-a556-38709abf8e0c" containerName="mariadb-account-create-update" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118106 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c46baf-4492-41d3-a556-38709abf8e0c" containerName="mariadb-account-create-update" Nov 27 11:27:58 crc kubenswrapper[4807]: E1127 11:27:58.118115 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6cc86c-1844-4444-85a9-370f5bc090d2" containerName="mariadb-account-create-update" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118121 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6cc86c-1844-4444-85a9-370f5bc090d2" containerName="mariadb-account-create-update" Nov 27 11:27:58 crc kubenswrapper[4807]: E1127 11:27:58.118133 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8965e4-cee0-4551-8bd1-f8322e804eef" containerName="neutron-api" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118138 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8965e4-cee0-4551-8bd1-f8322e804eef" containerName="neutron-api" Nov 27 11:27:58 crc kubenswrapper[4807]: E1127 11:27:58.118174 4807 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f3c060-bf19-4af4-be26-521942d50da4" containerName="mariadb-database-create" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118180 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f3c060-bf19-4af4-be26-521942d50da4" containerName="mariadb-database-create" Nov 27 11:27:58 crc kubenswrapper[4807]: E1127 11:27:58.118192 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8965e4-cee0-4551-8bd1-f8322e804eef" containerName="neutron-httpd" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118198 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8965e4-cee0-4551-8bd1-f8322e804eef" containerName="neutron-httpd" Nov 27 11:27:58 crc kubenswrapper[4807]: E1127 11:27:58.118214 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="817319b3-2356-48a2-9922-79cfb7b00623" containerName="mariadb-database-create" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118220 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="817319b3-2356-48a2-9922-79cfb7b00623" containerName="mariadb-database-create" Nov 27 11:27:58 crc kubenswrapper[4807]: E1127 11:27:58.118227 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa0aaff-659e-4816-9cba-298325fbc28c" containerName="mariadb-account-create-update" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118233 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa0aaff-659e-4816-9cba-298325fbc28c" containerName="mariadb-account-create-update" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118454 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8965e4-cee0-4551-8bd1-f8322e804eef" containerName="neutron-httpd" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118470 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa0aaff-659e-4816-9cba-298325fbc28c" containerName="mariadb-account-create-update" Nov 
27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118480 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="25f3c060-bf19-4af4-be26-521942d50da4" containerName="mariadb-database-create" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118489 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ba3ef20-046c-4460-8533-132da27f4e06" containerName="mariadb-database-create" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118497 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="817319b3-2356-48a2-9922-79cfb7b00623" containerName="mariadb-database-create" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118515 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6cc86c-1844-4444-85a9-370f5bc090d2" containerName="mariadb-account-create-update" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118524 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="67c46baf-4492-41d3-a556-38709abf8e0c" containerName="mariadb-account-create-update" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.118537 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8965e4-cee0-4551-8bd1-f8322e804eef" containerName="neutron-api" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.119749 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.122914 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.123127 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.123276 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9p2sx" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.150886 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cwjzz\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.150922 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-scripts\") pod \"nova-cell0-conductor-db-sync-cwjzz\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.150941 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvh9h\" (UniqueName: \"kubernetes.io/projected/5650a6a6-37ed-42ec-812a-f94eb9a92117-kube-api-access-gvh9h\") pod \"nova-cell0-conductor-db-sync-cwjzz\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.151056 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-config-data\") pod \"nova-cell0-conductor-db-sync-cwjzz\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.162167 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cwjzz"] Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.193044 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.253913 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-horizon-tls-certs\") pod \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.254310 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/972db85e-5d7f-4312-b2c1-36f3c4e697d3-scripts\") pod \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.254369 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/972db85e-5d7f-4312-b2c1-36f3c4e697d3-config-data\") pod \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.254403 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssbt6\" (UniqueName: \"kubernetes.io/projected/972db85e-5d7f-4312-b2c1-36f3c4e697d3-kube-api-access-ssbt6\") pod 
\"972db85e-5d7f-4312-b2c1-36f3c4e697d3\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.254509 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/972db85e-5d7f-4312-b2c1-36f3c4e697d3-logs\") pod \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.254557 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-horizon-secret-key\") pod \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.254610 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-combined-ca-bundle\") pod \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\" (UID: \"972db85e-5d7f-4312-b2c1-36f3c4e697d3\") " Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.254881 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-scripts\") pod \"nova-cell0-conductor-db-sync-cwjzz\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.254909 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cwjzz\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.254934 
4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvh9h\" (UniqueName: \"kubernetes.io/projected/5650a6a6-37ed-42ec-812a-f94eb9a92117-kube-api-access-gvh9h\") pod \"nova-cell0-conductor-db-sync-cwjzz\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.255039 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-config-data\") pod \"nova-cell0-conductor-db-sync-cwjzz\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.255896 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/972db85e-5d7f-4312-b2c1-36f3c4e697d3-logs" (OuterVolumeSpecName: "logs") pod "972db85e-5d7f-4312-b2c1-36f3c4e697d3" (UID: "972db85e-5d7f-4312-b2c1-36f3c4e697d3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.263503 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-config-data\") pod \"nova-cell0-conductor-db-sync-cwjzz\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.265760 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-scripts\") pod \"nova-cell0-conductor-db-sync-cwjzz\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.266214 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cwjzz\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.275403 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "972db85e-5d7f-4312-b2c1-36f3c4e697d3" (UID: "972db85e-5d7f-4312-b2c1-36f3c4e697d3"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.281404 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/972db85e-5d7f-4312-b2c1-36f3c4e697d3-kube-api-access-ssbt6" (OuterVolumeSpecName: "kube-api-access-ssbt6") pod "972db85e-5d7f-4312-b2c1-36f3c4e697d3" (UID: "972db85e-5d7f-4312-b2c1-36f3c4e697d3"). InnerVolumeSpecName "kube-api-access-ssbt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.283481 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvh9h\" (UniqueName: \"kubernetes.io/projected/5650a6a6-37ed-42ec-812a-f94eb9a92117-kube-api-access-gvh9h\") pod \"nova-cell0-conductor-db-sync-cwjzz\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.295873 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/972db85e-5d7f-4312-b2c1-36f3c4e697d3-config-data" (OuterVolumeSpecName: "config-data") pod "972db85e-5d7f-4312-b2c1-36f3c4e697d3" (UID: "972db85e-5d7f-4312-b2c1-36f3c4e697d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.306603 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/972db85e-5d7f-4312-b2c1-36f3c4e697d3-scripts" (OuterVolumeSpecName: "scripts") pod "972db85e-5d7f-4312-b2c1-36f3c4e697d3" (UID: "972db85e-5d7f-4312-b2c1-36f3c4e697d3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.317496 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "972db85e-5d7f-4312-b2c1-36f3c4e697d3" (UID: "972db85e-5d7f-4312-b2c1-36f3c4e697d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.335033 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "972db85e-5d7f-4312-b2c1-36f3c4e697d3" (UID: "972db85e-5d7f-4312-b2c1-36f3c4e697d3"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.357106 4807 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.357211 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/972db85e-5d7f-4312-b2c1-36f3c4e697d3-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.357314 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/972db85e-5d7f-4312-b2c1-36f3c4e697d3-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.357372 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssbt6\" (UniqueName: \"kubernetes.io/projected/972db85e-5d7f-4312-b2c1-36f3c4e697d3-kube-api-access-ssbt6\") on node 
\"crc\" DevicePath \"\"" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.357428 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/972db85e-5d7f-4312-b2c1-36f3c4e697d3-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.357480 4807 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.357532 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972db85e-5d7f-4312-b2c1-36f3c4e697d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.510749 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:27:58 crc kubenswrapper[4807]: I1127 11:27:58.954768 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cwjzz"] Nov 27 11:27:59 crc kubenswrapper[4807]: I1127 11:27:59.035947 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7b8bd6c76d-jg9hp" Nov 27 11:27:59 crc kubenswrapper[4807]: I1127 11:27:59.036463 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7b8bd6c76d-jg9hp" event={"ID":"972db85e-5d7f-4312-b2c1-36f3c4e697d3","Type":"ContainerDied","Data":"0ac0798928a7da5c337222df919f478a70538f3778eb4e26cc5c2ba283aa1a5b"} Nov 27 11:27:59 crc kubenswrapper[4807]: I1127 11:27:59.036516 4807 scope.go:117] "RemoveContainer" containerID="20965bc1b41dbc2aaabb0bc9a8457ca68ec138460ec493e4bad74931239240d6" Nov 27 11:27:59 crc kubenswrapper[4807]: I1127 11:27:59.037776 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cwjzz" event={"ID":"5650a6a6-37ed-42ec-812a-f94eb9a92117","Type":"ContainerStarted","Data":"c4deb76f4b87c9707a0729d6f7fc51e1565055bc26ca90e6229e6f395834e59a"} Nov 27 11:27:59 crc kubenswrapper[4807]: I1127 11:27:59.040336 4807 generic.go:334] "Generic (PLEG): container finished" podID="da707f4d-a2cd-426f-b524-874435ef409c" containerID="ad157d88570ad9ad10dd3f21af3b4681272f81ca4c53bccd1cecec85ce333532" exitCode=143 Nov 27 11:27:59 crc kubenswrapper[4807]: I1127 11:27:59.040377 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"da707f4d-a2cd-426f-b524-874435ef409c","Type":"ContainerDied","Data":"ad157d88570ad9ad10dd3f21af3b4681272f81ca4c53bccd1cecec85ce333532"} Nov 27 11:27:59 crc kubenswrapper[4807]: I1127 11:27:59.067752 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7b8bd6c76d-jg9hp"] Nov 27 11:27:59 crc kubenswrapper[4807]: I1127 11:27:59.074792 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7b8bd6c76d-jg9hp"] Nov 27 11:27:59 crc kubenswrapper[4807]: I1127 11:27:59.196565 4807 scope.go:117] "RemoveContainer" containerID="0431edfd02689c766bfba84526203198805f6a64725b59ad7a98d79e2b6095d8" Nov 27 11:27:59 crc kubenswrapper[4807]: I1127 
11:27:59.542087 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" path="/var/lib/kubelet/pods/972db85e-5d7f-4312-b2c1-36f3c4e697d3/volumes" Nov 27 11:28:01 crc kubenswrapper[4807]: I1127 11:28:01.067193 4807 generic.go:334] "Generic (PLEG): container finished" podID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerID="5671143180d9375e881250f0827afefd8468a1a07d89b4cd65cb23c88860d288" exitCode=0 Nov 27 11:28:01 crc kubenswrapper[4807]: I1127 11:28:01.067375 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f4c8710-cab5-4bf1-8bdd-86e6350e8058","Type":"ContainerDied","Data":"5671143180d9375e881250f0827afefd8468a1a07d89b4cd65cb23c88860d288"} Nov 27 11:28:02 crc kubenswrapper[4807]: I1127 11:28:02.079176 4807 generic.go:334] "Generic (PLEG): container finished" podID="da707f4d-a2cd-426f-b524-874435ef409c" containerID="edd43dc250c0190e440bb59c0b21000ef935603af1c0356ed90b85b963aa250e" exitCode=0 Nov 27 11:28:02 crc kubenswrapper[4807]: I1127 11:28:02.079224 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"da707f4d-a2cd-426f-b524-874435ef409c","Type":"ContainerDied","Data":"edd43dc250c0190e440bb59c0b21000ef935603af1c0356ed90b85b963aa250e"} Nov 27 11:28:02 crc kubenswrapper[4807]: I1127 11:28:02.420903 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 27 11:28:02 crc kubenswrapper[4807]: I1127 11:28:02.420946 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 27 11:28:02 crc kubenswrapper[4807]: I1127 11:28:02.453889 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 27 11:28:02 crc kubenswrapper[4807]: I1127 11:28:02.461545 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/glance-default-internal-api-0" Nov 27 11:28:03 crc kubenswrapper[4807]: I1127 11:28:03.086977 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 27 11:28:03 crc kubenswrapper[4807]: I1127 11:28:03.087292 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 27 11:28:04 crc kubenswrapper[4807]: I1127 11:28:04.949051 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 27 11:28:05 crc kubenswrapper[4807]: I1127 11:28:05.011299 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 27 11:28:05 crc kubenswrapper[4807]: I1127 11:28:05.939912 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.088358 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-combined-ca-bundle\") pod \"da707f4d-a2cd-426f-b524-874435ef409c\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.088401 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-scripts\") pod \"da707f4d-a2cd-426f-b524-874435ef409c\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.088505 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wv4d\" (UniqueName: \"kubernetes.io/projected/da707f4d-a2cd-426f-b524-874435ef409c-kube-api-access-7wv4d\") pod \"da707f4d-a2cd-426f-b524-874435ef409c\" (UID: 
\"da707f4d-a2cd-426f-b524-874435ef409c\") " Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.088635 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-public-tls-certs\") pod \"da707f4d-a2cd-426f-b524-874435ef409c\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.088660 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-config-data\") pod \"da707f4d-a2cd-426f-b524-874435ef409c\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.088681 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da707f4d-a2cd-426f-b524-874435ef409c-logs\") pod \"da707f4d-a2cd-426f-b524-874435ef409c\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.088711 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da707f4d-a2cd-426f-b524-874435ef409c-httpd-run\") pod \"da707f4d-a2cd-426f-b524-874435ef409c\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.088728 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"da707f4d-a2cd-426f-b524-874435ef409c\" (UID: \"da707f4d-a2cd-426f-b524-874435ef409c\") " Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.089645 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da707f4d-a2cd-426f-b524-874435ef409c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod 
"da707f4d-a2cd-426f-b524-874435ef409c" (UID: "da707f4d-a2cd-426f-b524-874435ef409c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.090083 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da707f4d-a2cd-426f-b524-874435ef409c-logs" (OuterVolumeSpecName: "logs") pod "da707f4d-a2cd-426f-b524-874435ef409c" (UID: "da707f4d-a2cd-426f-b524-874435ef409c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.098368 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-scripts" (OuterVolumeSpecName: "scripts") pod "da707f4d-a2cd-426f-b524-874435ef409c" (UID: "da707f4d-a2cd-426f-b524-874435ef409c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.101399 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "da707f4d-a2cd-426f-b524-874435ef409c" (UID: "da707f4d-a2cd-426f-b524-874435ef409c"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.101422 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da707f4d-a2cd-426f-b524-874435ef409c-kube-api-access-7wv4d" (OuterVolumeSpecName: "kube-api-access-7wv4d") pod "da707f4d-a2cd-426f-b524-874435ef409c" (UID: "da707f4d-a2cd-426f-b524-874435ef409c"). InnerVolumeSpecName "kube-api-access-7wv4d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.126835 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cwjzz" event={"ID":"5650a6a6-37ed-42ec-812a-f94eb9a92117","Type":"ContainerStarted","Data":"90ff2d035d4aa231be37e6f060e39aeb77c62c088c7003f66bcc5e904da8b60b"} Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.134760 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.135170 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"da707f4d-a2cd-426f-b524-874435ef409c","Type":"ContainerDied","Data":"d3fdbd4879dd3b2a6f4f924d37802339a97014a62f6fa91f650a841abae24e27"} Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.135310 4807 scope.go:117] "RemoveContainer" containerID="edd43dc250c0190e440bb59c0b21000ef935603af1c0356ed90b85b963aa250e" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.149812 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cwjzz" podStartSLOduration=1.608999731 podStartE2EDuration="8.149795602s" podCreationTimestamp="2025-11-27 11:27:58 +0000 UTC" firstStartedPulling="2025-11-27 11:27:58.960059737 +0000 UTC m=+1120.059557925" lastFinishedPulling="2025-11-27 11:28:05.500855598 +0000 UTC m=+1126.600353796" observedRunningTime="2025-11-27 11:28:06.143769085 +0000 UTC m=+1127.243267283" watchObservedRunningTime="2025-11-27 11:28:06.149795602 +0000 UTC m=+1127.249293800" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.150871 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da707f4d-a2cd-426f-b524-874435ef409c" 
(UID: "da707f4d-a2cd-426f-b524-874435ef409c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.165611 4807 scope.go:117] "RemoveContainer" containerID="ad157d88570ad9ad10dd3f21af3b4681272f81ca4c53bccd1cecec85ce333532" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.181886 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-config-data" (OuterVolumeSpecName: "config-data") pod "da707f4d-a2cd-426f-b524-874435ef409c" (UID: "da707f4d-a2cd-426f-b524-874435ef409c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.185404 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "da707f4d-a2cd-426f-b524-874435ef409c" (UID: "da707f4d-a2cd-426f-b524-874435ef409c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.190733 4807 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.190767 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.190776 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da707f4d-a2cd-426f-b524-874435ef409c-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.190800 4807 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/da707f4d-a2cd-426f-b524-874435ef409c-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.190828 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.190838 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.191025 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da707f4d-a2cd-426f-b524-874435ef409c-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.191039 4807 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-7wv4d\" (UniqueName: \"kubernetes.io/projected/da707f4d-a2cd-426f-b524-874435ef409c-kube-api-access-7wv4d\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.213660 4807 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.292721 4807 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.475220 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.493698 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.505586 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:28:06 crc kubenswrapper[4807]: E1127 11:28:06.506032 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerName="horizon-log" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.506056 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerName="horizon-log" Nov 27 11:28:06 crc kubenswrapper[4807]: E1127 11:28:06.506087 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da707f4d-a2cd-426f-b524-874435ef409c" containerName="glance-log" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.506096 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="da707f4d-a2cd-426f-b524-874435ef409c" containerName="glance-log" Nov 27 11:28:06 crc kubenswrapper[4807]: E1127 11:28:06.506136 4807 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="da707f4d-a2cd-426f-b524-874435ef409c" containerName="glance-httpd" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.506146 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="da707f4d-a2cd-426f-b524-874435ef409c" containerName="glance-httpd" Nov 27 11:28:06 crc kubenswrapper[4807]: E1127 11:28:06.506160 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerName="horizon" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.506166 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerName="horizon" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.506390 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerName="horizon-log" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.506415 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="da707f4d-a2cd-426f-b524-874435ef409c" containerName="glance-httpd" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.506429 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="972db85e-5d7f-4312-b2c1-36f3c4e697d3" containerName="horizon" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.506448 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="da707f4d-a2cd-426f-b524-874435ef409c" containerName="glance-log" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.507559 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.509540 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.510493 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.533482 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.598043 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a059d86-8a32-481a-80c7-e9675cb921b9-config-data\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.598369 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a059d86-8a32-481a-80c7-e9675cb921b9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.598394 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a059d86-8a32-481a-80c7-e9675cb921b9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.598486 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.598521 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a059d86-8a32-481a-80c7-e9675cb921b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.598553 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a059d86-8a32-481a-80c7-e9675cb921b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.598609 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bttp4\" (UniqueName: \"kubernetes.io/projected/0a059d86-8a32-481a-80c7-e9675cb921b9-kube-api-access-bttp4\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.598679 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a059d86-8a32-481a-80c7-e9675cb921b9-logs\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.699889 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.699957 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a059d86-8a32-481a-80c7-e9675cb921b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.699990 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a059d86-8a32-481a-80c7-e9675cb921b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.700026 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bttp4\" (UniqueName: \"kubernetes.io/projected/0a059d86-8a32-481a-80c7-e9675cb921b9-kube-api-access-bttp4\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.700063 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a059d86-8a32-481a-80c7-e9675cb921b9-logs\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.700080 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a059d86-8a32-481a-80c7-e9675cb921b9-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.700108 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a059d86-8a32-481a-80c7-e9675cb921b9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.700129 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a059d86-8a32-481a-80c7-e9675cb921b9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.700547 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a059d86-8a32-481a-80c7-e9675cb921b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.700817 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a059d86-8a32-481a-80c7-e9675cb921b9-logs\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.701134 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") device mount path 
\"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.705233 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a059d86-8a32-481a-80c7-e9675cb921b9-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.711699 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a059d86-8a32-481a-80c7-e9675cb921b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.711902 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a059d86-8a32-481a-80c7-e9675cb921b9-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.716116 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a059d86-8a32-481a-80c7-e9675cb921b9-config-data\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.720796 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bttp4\" (UniqueName: \"kubernetes.io/projected/0a059d86-8a32-481a-80c7-e9675cb921b9-kube-api-access-bttp4\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc 
kubenswrapper[4807]: I1127 11:28:06.731241 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0a059d86-8a32-481a-80c7-e9675cb921b9\") " pod="openstack/glance-default-external-api-0" Nov 27 11:28:06 crc kubenswrapper[4807]: I1127 11:28:06.834616 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 27 11:28:07 crc kubenswrapper[4807]: I1127 11:28:07.367026 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 27 11:28:07 crc kubenswrapper[4807]: W1127 11:28:07.373883 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a059d86_8a32_481a_80c7_e9675cb921b9.slice/crio-3afec722e21f899fbc40d851afef37dc671ab33329e53a1f0866ea5333ed41ca WatchSource:0}: Error finding container 3afec722e21f899fbc40d851afef37dc671ab33329e53a1f0866ea5333ed41ca: Status 404 returned error can't find the container with id 3afec722e21f899fbc40d851afef37dc671ab33329e53a1f0866ea5333ed41ca Nov 27 11:28:07 crc kubenswrapper[4807]: I1127 11:28:07.544656 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da707f4d-a2cd-426f-b524-874435ef409c" path="/var/lib/kubelet/pods/da707f4d-a2cd-426f-b524-874435ef409c/volumes" Nov 27 11:28:08 crc kubenswrapper[4807]: I1127 11:28:08.153593 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0a059d86-8a32-481a-80c7-e9675cb921b9","Type":"ContainerStarted","Data":"4afdb38bdc59d1bddec7aeb0a1f41c15071020eb16f605f4cff3492daca8ac11"} Nov 27 11:28:08 crc kubenswrapper[4807]: I1127 11:28:08.154381 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"0a059d86-8a32-481a-80c7-e9675cb921b9","Type":"ContainerStarted","Data":"3afec722e21f899fbc40d851afef37dc671ab33329e53a1f0866ea5333ed41ca"} Nov 27 11:28:09 crc kubenswrapper[4807]: I1127 11:28:09.163993 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0a059d86-8a32-481a-80c7-e9675cb921b9","Type":"ContainerStarted","Data":"a22e122820055211fe78fc6caf1cd33daa96dd7a16303d87b554d492d7546062"} Nov 27 11:28:16 crc kubenswrapper[4807]: I1127 11:28:16.233106 4807 generic.go:334] "Generic (PLEG): container finished" podID="5650a6a6-37ed-42ec-812a-f94eb9a92117" containerID="90ff2d035d4aa231be37e6f060e39aeb77c62c088c7003f66bcc5e904da8b60b" exitCode=0 Nov 27 11:28:16 crc kubenswrapper[4807]: I1127 11:28:16.233196 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cwjzz" event={"ID":"5650a6a6-37ed-42ec-812a-f94eb9a92117","Type":"ContainerDied","Data":"90ff2d035d4aa231be37e6f060e39aeb77c62c088c7003f66bcc5e904da8b60b"} Nov 27 11:28:16 crc kubenswrapper[4807]: I1127 11:28:16.255122 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.255104901 podStartE2EDuration="10.255104901s" podCreationTimestamp="2025-11-27 11:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:28:09.193012982 +0000 UTC m=+1130.292511190" watchObservedRunningTime="2025-11-27 11:28:16.255104901 +0000 UTC m=+1137.354603099" Nov 27 11:28:16 crc kubenswrapper[4807]: I1127 11:28:16.835452 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 27 11:28:16 crc kubenswrapper[4807]: I1127 11:28:16.835485 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 27 
11:28:16 crc kubenswrapper[4807]: I1127 11:28:16.871821 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 27 11:28:16 crc kubenswrapper[4807]: I1127 11:28:16.885189 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.241969 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.242022 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.571599 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.608875 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-scripts\") pod \"5650a6a6-37ed-42ec-812a-f94eb9a92117\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.609352 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvh9h\" (UniqueName: \"kubernetes.io/projected/5650a6a6-37ed-42ec-812a-f94eb9a92117-kube-api-access-gvh9h\") pod \"5650a6a6-37ed-42ec-812a-f94eb9a92117\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.609509 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-combined-ca-bundle\") pod \"5650a6a6-37ed-42ec-812a-f94eb9a92117\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " Nov 27 11:28:17 crc 
kubenswrapper[4807]: I1127 11:28:17.609576 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-config-data\") pod \"5650a6a6-37ed-42ec-812a-f94eb9a92117\" (UID: \"5650a6a6-37ed-42ec-812a-f94eb9a92117\") " Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.614962 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-scripts" (OuterVolumeSpecName: "scripts") pod "5650a6a6-37ed-42ec-812a-f94eb9a92117" (UID: "5650a6a6-37ed-42ec-812a-f94eb9a92117"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.626732 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5650a6a6-37ed-42ec-812a-f94eb9a92117-kube-api-access-gvh9h" (OuterVolumeSpecName: "kube-api-access-gvh9h") pod "5650a6a6-37ed-42ec-812a-f94eb9a92117" (UID: "5650a6a6-37ed-42ec-812a-f94eb9a92117"). InnerVolumeSpecName "kube-api-access-gvh9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.638191 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-config-data" (OuterVolumeSpecName: "config-data") pod "5650a6a6-37ed-42ec-812a-f94eb9a92117" (UID: "5650a6a6-37ed-42ec-812a-f94eb9a92117"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.661387 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5650a6a6-37ed-42ec-812a-f94eb9a92117" (UID: "5650a6a6-37ed-42ec-812a-f94eb9a92117"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.711853 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.711890 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.711899 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5650a6a6-37ed-42ec-812a-f94eb9a92117-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:17 crc kubenswrapper[4807]: I1127 11:28:17.711907 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvh9h\" (UniqueName: \"kubernetes.io/projected/5650a6a6-37ed-42ec-812a-f94eb9a92117-kube-api-access-gvh9h\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.130949 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.251350 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cwjzz" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.251346 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cwjzz" event={"ID":"5650a6a6-37ed-42ec-812a-f94eb9a92117","Type":"ContainerDied","Data":"c4deb76f4b87c9707a0729d6f7fc51e1565055bc26ca90e6229e6f395834e59a"} Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.251483 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4deb76f4b87c9707a0729d6f7fc51e1565055bc26ca90e6229e6f395834e59a" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.345343 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 27 11:28:18 crc kubenswrapper[4807]: E1127 11:28:18.345747 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5650a6a6-37ed-42ec-812a-f94eb9a92117" containerName="nova-cell0-conductor-db-sync" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.345764 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5650a6a6-37ed-42ec-812a-f94eb9a92117" containerName="nova-cell0-conductor-db-sync" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.345977 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5650a6a6-37ed-42ec-812a-f94eb9a92117" containerName="nova-cell0-conductor-db-sync" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.346641 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.350066 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-9p2sx" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.354655 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.355180 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.425754 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621dbc60-ba00-466f-8cbb-2e58611dff37-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"621dbc60-ba00-466f-8cbb-2e58611dff37\") " pod="openstack/nova-cell0-conductor-0" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.425817 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92zxp\" (UniqueName: \"kubernetes.io/projected/621dbc60-ba00-466f-8cbb-2e58611dff37-kube-api-access-92zxp\") pod \"nova-cell0-conductor-0\" (UID: \"621dbc60-ba00-466f-8cbb-2e58611dff37\") " pod="openstack/nova-cell0-conductor-0" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.425995 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621dbc60-ba00-466f-8cbb-2e58611dff37-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"621dbc60-ba00-466f-8cbb-2e58611dff37\") " pod="openstack/nova-cell0-conductor-0" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.527233 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/621dbc60-ba00-466f-8cbb-2e58611dff37-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"621dbc60-ba00-466f-8cbb-2e58611dff37\") " pod="openstack/nova-cell0-conductor-0" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.527563 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621dbc60-ba00-466f-8cbb-2e58611dff37-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"621dbc60-ba00-466f-8cbb-2e58611dff37\") " pod="openstack/nova-cell0-conductor-0" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.527729 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92zxp\" (UniqueName: \"kubernetes.io/projected/621dbc60-ba00-466f-8cbb-2e58611dff37-kube-api-access-92zxp\") pod \"nova-cell0-conductor-0\" (UID: \"621dbc60-ba00-466f-8cbb-2e58611dff37\") " pod="openstack/nova-cell0-conductor-0" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.532180 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/621dbc60-ba00-466f-8cbb-2e58611dff37-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"621dbc60-ba00-466f-8cbb-2e58611dff37\") " pod="openstack/nova-cell0-conductor-0" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.532859 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621dbc60-ba00-466f-8cbb-2e58611dff37-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"621dbc60-ba00-466f-8cbb-2e58611dff37\") " pod="openstack/nova-cell0-conductor-0" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.556513 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92zxp\" (UniqueName: \"kubernetes.io/projected/621dbc60-ba00-466f-8cbb-2e58611dff37-kube-api-access-92zxp\") pod \"nova-cell0-conductor-0\" 
(UID: \"621dbc60-ba00-466f-8cbb-2e58611dff37\") " pod="openstack/nova-cell0-conductor-0" Nov 27 11:28:18 crc kubenswrapper[4807]: I1127 11:28:18.662143 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 27 11:28:19 crc kubenswrapper[4807]: I1127 11:28:19.154699 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 27 11:28:19 crc kubenswrapper[4807]: I1127 11:28:19.259280 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"621dbc60-ba00-466f-8cbb-2e58611dff37","Type":"ContainerStarted","Data":"b4f03bec9741dcc16db3173347f2d54de3827f527c6550a98d00077019bda8bd"} Nov 27 11:28:19 crc kubenswrapper[4807]: I1127 11:28:19.260747 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 27 11:28:19 crc kubenswrapper[4807]: I1127 11:28:19.260848 4807 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 27 11:28:19 crc kubenswrapper[4807]: I1127 11:28:19.271957 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 27 11:28:20 crc kubenswrapper[4807]: I1127 11:28:20.270409 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"621dbc60-ba00-466f-8cbb-2e58611dff37","Type":"ContainerStarted","Data":"93136f91632438c37c395fc7ccc8feccc59c169f440d7059138e046125ccae7f"} Nov 27 11:28:20 crc kubenswrapper[4807]: I1127 11:28:20.270911 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 27 11:28:20 crc kubenswrapper[4807]: I1127 11:28:20.287004 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.286984447 podStartE2EDuration="2.286984447s" podCreationTimestamp="2025-11-27 
11:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:28:20.283014633 +0000 UTC m=+1141.382512841" watchObservedRunningTime="2025-11-27 11:28:20.286984447 +0000 UTC m=+1141.386482645" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.249815 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.308431 4807 generic.go:334] "Generic (PLEG): container finished" podID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerID="75570ec250347d91caef7e277ed2a6a1f28d0263bc5cb9cbd917ebfce447dc0c" exitCode=137 Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.308673 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f4c8710-cab5-4bf1-8bdd-86e6350e8058","Type":"ContainerDied","Data":"75570ec250347d91caef7e277ed2a6a1f28d0263bc5cb9cbd917ebfce447dc0c"} Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.308750 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0f4c8710-cab5-4bf1-8bdd-86e6350e8058","Type":"ContainerDied","Data":"9998d6c53d95f028880a1e60c79eb33067b25f3fd37277a213b89ed84540c7e3"} Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.308827 4807 scope.go:117] "RemoveContainer" containerID="75570ec250347d91caef7e277ed2a6a1f28d0263bc5cb9cbd917ebfce447dc0c" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.308992 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.332869 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-config-data\") pod \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.332928 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-log-httpd\") pod \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.332972 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkxtr\" (UniqueName: \"kubernetes.io/projected/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-kube-api-access-xkxtr\") pod \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.333023 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-combined-ca-bundle\") pod \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.333062 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-run-httpd\") pod \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.333127 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-sg-core-conf-yaml\") pod \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.333223 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-scripts\") pod \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\" (UID: \"0f4c8710-cab5-4bf1-8bdd-86e6350e8058\") " Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.334446 4807 scope.go:117] "RemoveContainer" containerID="149cf5085c74a35f044da2d633419d3b6d777c039955da32466803b43527f85c" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.334842 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0f4c8710-cab5-4bf1-8bdd-86e6350e8058" (UID: "0f4c8710-cab5-4bf1-8bdd-86e6350e8058"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.335480 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0f4c8710-cab5-4bf1-8bdd-86e6350e8058" (UID: "0f4c8710-cab5-4bf1-8bdd-86e6350e8058"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.339003 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-kube-api-access-xkxtr" (OuterVolumeSpecName: "kube-api-access-xkxtr") pod "0f4c8710-cab5-4bf1-8bdd-86e6350e8058" (UID: "0f4c8710-cab5-4bf1-8bdd-86e6350e8058"). InnerVolumeSpecName "kube-api-access-xkxtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.339078 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-scripts" (OuterVolumeSpecName: "scripts") pod "0f4c8710-cab5-4bf1-8bdd-86e6350e8058" (UID: "0f4c8710-cab5-4bf1-8bdd-86e6350e8058"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.380229 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0f4c8710-cab5-4bf1-8bdd-86e6350e8058" (UID: "0f4c8710-cab5-4bf1-8bdd-86e6350e8058"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.414045 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f4c8710-cab5-4bf1-8bdd-86e6350e8058" (UID: "0f4c8710-cab5-4bf1-8bdd-86e6350e8058"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.439642 4807 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.439679 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.439692 4807 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.439707 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkxtr\" (UniqueName: \"kubernetes.io/projected/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-kube-api-access-xkxtr\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.439723 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.439705 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-config-data" (OuterVolumeSpecName: "config-data") pod "0f4c8710-cab5-4bf1-8bdd-86e6350e8058" (UID: "0f4c8710-cab5-4bf1-8bdd-86e6350e8058"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.439735 4807 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.458077 4807 scope.go:117] "RemoveContainer" containerID="8023000156d97e1e14ea3cfdecc4769d87ac45c20305e073611f2453d4283d30" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.477741 4807 scope.go:117] "RemoveContainer" containerID="5671143180d9375e881250f0827afefd8468a1a07d89b4cd65cb23c88860d288" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.505407 4807 scope.go:117] "RemoveContainer" containerID="75570ec250347d91caef7e277ed2a6a1f28d0263bc5cb9cbd917ebfce447dc0c" Nov 27 11:28:24 crc kubenswrapper[4807]: E1127 11:28:24.506082 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75570ec250347d91caef7e277ed2a6a1f28d0263bc5cb9cbd917ebfce447dc0c\": container with ID starting with 75570ec250347d91caef7e277ed2a6a1f28d0263bc5cb9cbd917ebfce447dc0c not found: ID does not exist" containerID="75570ec250347d91caef7e277ed2a6a1f28d0263bc5cb9cbd917ebfce447dc0c" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.506146 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75570ec250347d91caef7e277ed2a6a1f28d0263bc5cb9cbd917ebfce447dc0c"} err="failed to get container status \"75570ec250347d91caef7e277ed2a6a1f28d0263bc5cb9cbd917ebfce447dc0c\": rpc error: code = NotFound desc = could not find container \"75570ec250347d91caef7e277ed2a6a1f28d0263bc5cb9cbd917ebfce447dc0c\": container with ID starting with 75570ec250347d91caef7e277ed2a6a1f28d0263bc5cb9cbd917ebfce447dc0c not found: ID does not exist" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.506179 4807 scope.go:117] 
"RemoveContainer" containerID="149cf5085c74a35f044da2d633419d3b6d777c039955da32466803b43527f85c" Nov 27 11:28:24 crc kubenswrapper[4807]: E1127 11:28:24.506550 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"149cf5085c74a35f044da2d633419d3b6d777c039955da32466803b43527f85c\": container with ID starting with 149cf5085c74a35f044da2d633419d3b6d777c039955da32466803b43527f85c not found: ID does not exist" containerID="149cf5085c74a35f044da2d633419d3b6d777c039955da32466803b43527f85c" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.506586 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"149cf5085c74a35f044da2d633419d3b6d777c039955da32466803b43527f85c"} err="failed to get container status \"149cf5085c74a35f044da2d633419d3b6d777c039955da32466803b43527f85c\": rpc error: code = NotFound desc = could not find container \"149cf5085c74a35f044da2d633419d3b6d777c039955da32466803b43527f85c\": container with ID starting with 149cf5085c74a35f044da2d633419d3b6d777c039955da32466803b43527f85c not found: ID does not exist" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.506616 4807 scope.go:117] "RemoveContainer" containerID="8023000156d97e1e14ea3cfdecc4769d87ac45c20305e073611f2453d4283d30" Nov 27 11:28:24 crc kubenswrapper[4807]: E1127 11:28:24.507055 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8023000156d97e1e14ea3cfdecc4769d87ac45c20305e073611f2453d4283d30\": container with ID starting with 8023000156d97e1e14ea3cfdecc4769d87ac45c20305e073611f2453d4283d30 not found: ID does not exist" containerID="8023000156d97e1e14ea3cfdecc4769d87ac45c20305e073611f2453d4283d30" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.507088 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8023000156d97e1e14ea3cfdecc4769d87ac45c20305e073611f2453d4283d30"} err="failed to get container status \"8023000156d97e1e14ea3cfdecc4769d87ac45c20305e073611f2453d4283d30\": rpc error: code = NotFound desc = could not find container \"8023000156d97e1e14ea3cfdecc4769d87ac45c20305e073611f2453d4283d30\": container with ID starting with 8023000156d97e1e14ea3cfdecc4769d87ac45c20305e073611f2453d4283d30 not found: ID does not exist" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.507109 4807 scope.go:117] "RemoveContainer" containerID="5671143180d9375e881250f0827afefd8468a1a07d89b4cd65cb23c88860d288" Nov 27 11:28:24 crc kubenswrapper[4807]: E1127 11:28:24.507413 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5671143180d9375e881250f0827afefd8468a1a07d89b4cd65cb23c88860d288\": container with ID starting with 5671143180d9375e881250f0827afefd8468a1a07d89b4cd65cb23c88860d288 not found: ID does not exist" containerID="5671143180d9375e881250f0827afefd8468a1a07d89b4cd65cb23c88860d288" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.507449 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5671143180d9375e881250f0827afefd8468a1a07d89b4cd65cb23c88860d288"} err="failed to get container status \"5671143180d9375e881250f0827afefd8468a1a07d89b4cd65cb23c88860d288\": rpc error: code = NotFound desc = could not find container \"5671143180d9375e881250f0827afefd8468a1a07d89b4cd65cb23c88860d288\": container with ID starting with 5671143180d9375e881250f0827afefd8468a1a07d89b4cd65cb23c88860d288 not found: ID does not exist" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.541191 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f4c8710-cab5-4bf1-8bdd-86e6350e8058-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:24 crc kubenswrapper[4807]: 
I1127 11:28:24.647059 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.662228 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.672214 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:28:24 crc kubenswrapper[4807]: E1127 11:28:24.672722 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="proxy-httpd" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.672745 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="proxy-httpd" Nov 27 11:28:24 crc kubenswrapper[4807]: E1127 11:28:24.672762 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="ceilometer-central-agent" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.672771 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="ceilometer-central-agent" Nov 27 11:28:24 crc kubenswrapper[4807]: E1127 11:28:24.673644 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="sg-core" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.673671 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="sg-core" Nov 27 11:28:24 crc kubenswrapper[4807]: E1127 11:28:24.673691 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="ceilometer-notification-agent" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.673700 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" 
containerName="ceilometer-notification-agent" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.673992 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="ceilometer-notification-agent" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.674024 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="proxy-httpd" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.674044 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="ceilometer-central-agent" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.674067 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" containerName="sg-core" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.675962 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.684866 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.694590 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.700700 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.846405 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-log-httpd\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.846448 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.846474 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-scripts\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.846539 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b4h4\" (UniqueName: \"kubernetes.io/projected/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-kube-api-access-8b4h4\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.846556 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.846586 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-run-httpd\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.846606 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-config-data\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.947870 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b4h4\" (UniqueName: \"kubernetes.io/projected/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-kube-api-access-8b4h4\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.948150 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.948190 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-run-httpd\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.948218 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-config-data\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.948305 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-log-httpd\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 
11:28:24.948322 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.948348 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-scripts\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.949315 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-run-httpd\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.950035 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-log-httpd\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.952148 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-scripts\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.952524 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-config-data\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " 
pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.952796 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.952908 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:24 crc kubenswrapper[4807]: I1127 11:28:24.964551 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b4h4\" (UniqueName: \"kubernetes.io/projected/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-kube-api-access-8b4h4\") pod \"ceilometer-0\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") " pod="openstack/ceilometer-0" Nov 27 11:28:25 crc kubenswrapper[4807]: I1127 11:28:25.016482 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:28:25 crc kubenswrapper[4807]: I1127 11:28:25.474405 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:28:25 crc kubenswrapper[4807]: W1127 11:28:25.475709 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d09b001_7ffd_4b2c_9ab0_3b53e54c6491.slice/crio-f868ed3e5322ee8abd9fd7bb6382e65c3b8372e5c6501e3618b2dfa8c946bfec WatchSource:0}: Error finding container f868ed3e5322ee8abd9fd7bb6382e65c3b8372e5c6501e3618b2dfa8c946bfec: Status 404 returned error can't find the container with id f868ed3e5322ee8abd9fd7bb6382e65c3b8372e5c6501e3618b2dfa8c946bfec Nov 27 11:28:25 crc kubenswrapper[4807]: I1127 11:28:25.545696 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f4c8710-cab5-4bf1-8bdd-86e6350e8058" path="/var/lib/kubelet/pods/0f4c8710-cab5-4bf1-8bdd-86e6350e8058/volumes" Nov 27 11:28:26 crc kubenswrapper[4807]: I1127 11:28:26.326646 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491","Type":"ContainerStarted","Data":"f868ed3e5322ee8abd9fd7bb6382e65c3b8372e5c6501e3618b2dfa8c946bfec"} Nov 27 11:28:27 crc kubenswrapper[4807]: I1127 11:28:27.348052 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491","Type":"ContainerStarted","Data":"b6531669e70229606721c9e5aa00518833f15780b6d6e474d44365fbf4b0876c"} Nov 27 11:28:28 crc kubenswrapper[4807]: I1127 11:28:28.357254 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491","Type":"ContainerStarted","Data":"c9bea6942c23d826e34a12795c49bd4202c453fb4949bd2a1bf0dd93d00e3a07"} Nov 27 11:28:28 crc kubenswrapper[4807]: I1127 11:28:28.357767 4807 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491","Type":"ContainerStarted","Data":"8a23a8ee5dd1f6a8806df277e0705a464bae9430241b61c6068100a71e54cb24"} Nov 27 11:28:28 crc kubenswrapper[4807]: I1127 11:28:28.686139 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.170205 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-mrj5n"] Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.171586 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.173993 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.176715 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.202013 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mrj5n"] Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.322186 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-config-data\") pod \"nova-cell0-cell-mapping-mrj5n\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.322319 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpchd\" (UniqueName: \"kubernetes.io/projected/d88895da-e0b3-40c7-82bf-eb68882e01cd-kube-api-access-vpchd\") pod \"nova-cell0-cell-mapping-mrj5n\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") 
" pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.322369 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mrj5n\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.322394 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-scripts\") pod \"nova-cell0-cell-mapping-mrj5n\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.357234 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.358796 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.363898 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.372727 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.373885 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.377071 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.389309 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.400305 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.401691 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.407658 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.424748 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpchd\" (UniqueName: \"kubernetes.io/projected/d88895da-e0b3-40c7-82bf-eb68882e01cd-kube-api-access-vpchd\") pod \"nova-cell0-cell-mapping-mrj5n\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.424820 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mrj5n\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.424850 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-scripts\") pod \"nova-cell0-cell-mapping-mrj5n\" (UID: 
\"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.424908 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-config-data\") pod \"nova-cell0-cell-mapping-mrj5n\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.439921 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-scripts\") pod \"nova-cell0-cell-mapping-mrj5n\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.442754 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.459272 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-config-data\") pod \"nova-cell0-cell-mapping-mrj5n\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.470821 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-mrj5n\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.502815 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpchd\" (UniqueName: 
\"kubernetes.io/projected/d88895da-e0b3-40c7-82bf-eb68882e01cd-kube-api-access-vpchd\") pod \"nova-cell0-cell-mapping-mrj5n\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.524929 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.525878 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jggdw\" (UniqueName: \"kubernetes.io/projected/d55c3c6e-766e-4471-8c37-6603fb137822-kube-api-access-jggdw\") pod \"nova-metadata-0\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.525907 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.525952 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-config-data\") pod \"nova-api-0\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.525975 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjpph\" (UniqueName: \"kubernetes.io/projected/b38c64fe-fd37-446f-a0fc-e110a5904b22-kube-api-access-sjpph\") pod \"nova-cell1-novncproxy-0\" (UID: \"b38c64fe-fd37-446f-a0fc-e110a5904b22\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.525995 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b38c64fe-fd37-446f-a0fc-e110a5904b22-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b38c64fe-fd37-446f-a0fc-e110a5904b22\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.526030 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55c3c6e-766e-4471-8c37-6603fb137822-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.526043 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b38c64fe-fd37-446f-a0fc-e110a5904b22-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b38c64fe-fd37-446f-a0fc-e110a5904b22\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.526101 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-logs\") pod \"nova-api-0\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.526128 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cg88\" (UniqueName: \"kubernetes.io/projected/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-kube-api-access-9cg88\") pod \"nova-api-0\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.526141 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d55c3c6e-766e-4471-8c37-6603fb137822-logs\") pod \"nova-metadata-0\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.526165 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55c3c6e-766e-4471-8c37-6603fb137822-config-data\") pod \"nova-metadata-0\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.628053 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-config-data\") pod \"nova-api-0\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.628116 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjpph\" (UniqueName: \"kubernetes.io/projected/b38c64fe-fd37-446f-a0fc-e110a5904b22-kube-api-access-sjpph\") pod \"nova-cell1-novncproxy-0\" (UID: \"b38c64fe-fd37-446f-a0fc-e110a5904b22\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.628147 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b38c64fe-fd37-446f-a0fc-e110a5904b22-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b38c64fe-fd37-446f-a0fc-e110a5904b22\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.628212 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55c3c6e-766e-4471-8c37-6603fb137822-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.628237 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b38c64fe-fd37-446f-a0fc-e110a5904b22-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b38c64fe-fd37-446f-a0fc-e110a5904b22\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.628364 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-logs\") pod \"nova-api-0\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.628400 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cg88\" (UniqueName: \"kubernetes.io/projected/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-kube-api-access-9cg88\") pod \"nova-api-0\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.628425 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d55c3c6e-766e-4471-8c37-6603fb137822-logs\") pod \"nova-metadata-0\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.628459 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55c3c6e-766e-4471-8c37-6603fb137822-config-data\") pod \"nova-metadata-0\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.628493 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jggdw\" (UniqueName: \"kubernetes.io/projected/d55c3c6e-766e-4471-8c37-6603fb137822-kube-api-access-jggdw\") pod \"nova-metadata-0\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.628702 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.632044 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-logs\") pod \"nova-api-0\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.632811 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-config-data\") pod \"nova-api-0\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.633231 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55c3c6e-766e-4471-8c37-6603fb137822-config-data\") pod \"nova-metadata-0\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.635108 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d55c3c6e-766e-4471-8c37-6603fb137822-logs\") pod \"nova-metadata-0\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.639180 
4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-qvc29"] Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.639764 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55c3c6e-766e-4471-8c37-6603fb137822-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.641534 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.647941 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b38c64fe-fd37-446f-a0fc-e110a5904b22-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b38c64fe-fd37-446f-a0fc-e110a5904b22\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.648017 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b38c64fe-fd37-446f-a0fc-e110a5904b22-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b38c64fe-fd37-446f-a0fc-e110a5904b22\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.652543 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.665870 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cg88\" (UniqueName: \"kubernetes.io/projected/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-kube-api-access-9cg88\") pod 
\"nova-api-0\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.666728 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jggdw\" (UniqueName: \"kubernetes.io/projected/d55c3c6e-766e-4471-8c37-6603fb137822-kube-api-access-jggdw\") pod \"nova-metadata-0\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.670351 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjpph\" (UniqueName: \"kubernetes.io/projected/b38c64fe-fd37-446f-a0fc-e110a5904b22-kube-api-access-sjpph\") pod \"nova-cell1-novncproxy-0\" (UID: \"b38c64fe-fd37-446f-a0fc-e110a5904b22\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.678437 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.679605 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.683056 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.687828 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.706322 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-qvc29"] Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.717009 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.731525 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.731622 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-config\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.750140 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.752452 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.752596 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.752873 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mncmg\" (UniqueName: \"kubernetes.io/projected/4862f897-1024-446a-bc1c-9b8c9c8e7792-kube-api-access-mncmg\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.752968 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-dns-svc\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.799068 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.854543 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-config\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.854583 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.854637 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.854697 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3592034-67db-49ff-80ff-c3991a5dbaf7-config-data\") pod \"nova-scheduler-0\" (UID: \"d3592034-67db-49ff-80ff-c3991a5dbaf7\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.854722 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd8dd\" (UniqueName: \"kubernetes.io/projected/d3592034-67db-49ff-80ff-c3991a5dbaf7-kube-api-access-wd8dd\") pod \"nova-scheduler-0\" (UID: \"d3592034-67db-49ff-80ff-c3991a5dbaf7\") " 
pod="openstack/nova-scheduler-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.854754 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mncmg\" (UniqueName: \"kubernetes.io/projected/4862f897-1024-446a-bc1c-9b8c9c8e7792-kube-api-access-mncmg\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.854800 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3592034-67db-49ff-80ff-c3991a5dbaf7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d3592034-67db-49ff-80ff-c3991a5dbaf7\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.854821 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-dns-svc\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.854846 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.855605 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc 
kubenswrapper[4807]: I1127 11:28:29.856710 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.857641 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.858864 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-dns-svc\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.865409 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-config\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.878864 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mncmg\" (UniqueName: \"kubernetes.io/projected/4862f897-1024-446a-bc1c-9b8c9c8e7792-kube-api-access-mncmg\") pod \"dnsmasq-dns-757b4f8459-qvc29\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.955879 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.957956 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3592034-67db-49ff-80ff-c3991a5dbaf7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d3592034-67db-49ff-80ff-c3991a5dbaf7\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.958086 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3592034-67db-49ff-80ff-c3991a5dbaf7-config-data\") pod \"nova-scheduler-0\" (UID: \"d3592034-67db-49ff-80ff-c3991a5dbaf7\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.958117 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd8dd\" (UniqueName: \"kubernetes.io/projected/d3592034-67db-49ff-80ff-c3991a5dbaf7-kube-api-access-wd8dd\") pod \"nova-scheduler-0\" (UID: \"d3592034-67db-49ff-80ff-c3991a5dbaf7\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.961886 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3592034-67db-49ff-80ff-c3991a5dbaf7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d3592034-67db-49ff-80ff-c3991a5dbaf7\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.962942 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3592034-67db-49ff-80ff-c3991a5dbaf7-config-data\") pod \"nova-scheduler-0\" (UID: \"d3592034-67db-49ff-80ff-c3991a5dbaf7\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:29 crc kubenswrapper[4807]: I1127 11:28:29.995277 4807 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wd8dd\" (UniqueName: \"kubernetes.io/projected/d3592034-67db-49ff-80ff-c3991a5dbaf7-kube-api-access-wd8dd\") pod \"nova-scheduler-0\" (UID: \"d3592034-67db-49ff-80ff-c3991a5dbaf7\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.001856 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.015498 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 11:28:30 crc kubenswrapper[4807]: W1127 11:28:30.239407 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd55c3c6e_766e_4471_8c37_6603fb137822.slice/crio-b7e4857469c7f6e4a3e28fa5d84f8ad8e2d73fb3bcb286611ac9332734d6e864 WatchSource:0}: Error finding container b7e4857469c7f6e4a3e28fa5d84f8ad8e2d73fb3bcb286611ac9332734d6e864: Status 404 returned error can't find the container with id b7e4857469c7f6e4a3e28fa5d84f8ad8e2d73fb3bcb286611ac9332734d6e864 Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.240966 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:30 crc kubenswrapper[4807]: W1127 11:28:30.380536 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb91123e_7df5_46ff_bd9c_68eb9cd8f76a.slice/crio-14bf94e0865a4848cfbeaa376a1aa2df6b17af3a1fdd24f735210f0f24e9ff00 WatchSource:0}: Error finding container 14bf94e0865a4848cfbeaa376a1aa2df6b17af3a1fdd24f735210f0f24e9ff00: Status 404 returned error can't find the container with id 14bf94e0865a4848cfbeaa376a1aa2df6b17af3a1fdd24f735210f0f24e9ff00 Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.381362 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 
11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.386063 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d55c3c6e-766e-4471-8c37-6603fb137822","Type":"ContainerStarted","Data":"b7e4857469c7f6e4a3e28fa5d84f8ad8e2d73fb3bcb286611ac9332734d6e864"} Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.387670 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491","Type":"ContainerStarted","Data":"d5fea5a01bd51b28f808cc30c4f51471c5a893e8bc323836de2b0b5da9a01bd2"} Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.389231 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.416818 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.342550702 podStartE2EDuration="6.416527989s" podCreationTimestamp="2025-11-27 11:28:24 +0000 UTC" firstStartedPulling="2025-11-27 11:28:25.477919927 +0000 UTC m=+1146.577418135" lastFinishedPulling="2025-11-27 11:28:29.551897224 +0000 UTC m=+1150.651395422" observedRunningTime="2025-11-27 11:28:30.413624003 +0000 UTC m=+1151.513122201" watchObservedRunningTime="2025-11-27 11:28:30.416527989 +0000 UTC m=+1151.516026187" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.484435 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-mrj5n"] Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.520789 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mznhj"] Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.522874 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.530488 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.530671 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.545339 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mznhj"] Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.594446 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mznhj\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.594514 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-scripts\") pod \"nova-cell1-conductor-db-sync-mznhj\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.594540 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-config-data\") pod \"nova-cell1-conductor-db-sync-mznhj\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.594598 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-flx89\" (UniqueName: \"kubernetes.io/projected/9f6dc415-3063-48ab-8a84-27a041b110f4-kube-api-access-flx89\") pod \"nova-cell1-conductor-db-sync-mznhj\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:30 crc kubenswrapper[4807]: W1127 11:28:30.594665 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4862f897_1024_446a_bc1c_9b8c9c8e7792.slice/crio-0e1c32ac745999a6ac664e30ea3ac386f054dded4e56a31a6acb09e9eb001935 WatchSource:0}: Error finding container 0e1c32ac745999a6ac664e30ea3ac386f054dded4e56a31a6acb09e9eb001935: Status 404 returned error can't find the container with id 0e1c32ac745999a6ac664e30ea3ac386f054dded4e56a31a6acb09e9eb001935 Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.597295 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.617822 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-qvc29"] Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.683259 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.696284 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-scripts\") pod \"nova-cell1-conductor-db-sync-mznhj\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.696470 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-config-data\") pod \"nova-cell1-conductor-db-sync-mznhj\" (UID: 
\"9f6dc415-3063-48ab-8a84-27a041b110f4\") " pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.696707 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flx89\" (UniqueName: \"kubernetes.io/projected/9f6dc415-3063-48ab-8a84-27a041b110f4-kube-api-access-flx89\") pod \"nova-cell1-conductor-db-sync-mznhj\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.696896 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mznhj\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.702827 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-config-data\") pod \"nova-cell1-conductor-db-sync-mznhj\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.703299 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mznhj\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.703385 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-scripts\") pod \"nova-cell1-conductor-db-sync-mznhj\" (UID: 
\"9f6dc415-3063-48ab-8a84-27a041b110f4\") " pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:30 crc kubenswrapper[4807]: W1127 11:28:30.704696 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3592034_67db_49ff_80ff_c3991a5dbaf7.slice/crio-2f88a0a892065e96a1c9b00db127ba2a4b3a14a3e386f54c02e420e0018b431d WatchSource:0}: Error finding container 2f88a0a892065e96a1c9b00db127ba2a4b3a14a3e386f54c02e420e0018b431d: Status 404 returned error can't find the container with id 2f88a0a892065e96a1c9b00db127ba2a4b3a14a3e386f54c02e420e0018b431d Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.720481 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flx89\" (UniqueName: \"kubernetes.io/projected/9f6dc415-3063-48ab-8a84-27a041b110f4-kube-api-access-flx89\") pod \"nova-cell1-conductor-db-sync-mznhj\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:30 crc kubenswrapper[4807]: I1127 11:28:30.722501 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:31 crc kubenswrapper[4807]: I1127 11:28:31.158304 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mznhj"] Nov 27 11:28:31 crc kubenswrapper[4807]: I1127 11:28:31.403840 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a","Type":"ContainerStarted","Data":"14bf94e0865a4848cfbeaa376a1aa2df6b17af3a1fdd24f735210f0f24e9ff00"} Nov 27 11:28:31 crc kubenswrapper[4807]: I1127 11:28:31.407693 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d3592034-67db-49ff-80ff-c3991a5dbaf7","Type":"ContainerStarted","Data":"2f88a0a892065e96a1c9b00db127ba2a4b3a14a3e386f54c02e420e0018b431d"} Nov 27 11:28:31 crc kubenswrapper[4807]: I1127 11:28:31.413539 4807 generic.go:334] "Generic (PLEG): container finished" podID="4862f897-1024-446a-bc1c-9b8c9c8e7792" containerID="20cec08b68acb488954f0cd2e2f3dc2aa597baeadf9aae28d7edda1d29d5c51a" exitCode=0 Nov 27 11:28:31 crc kubenswrapper[4807]: I1127 11:28:31.413595 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" event={"ID":"4862f897-1024-446a-bc1c-9b8c9c8e7792","Type":"ContainerDied","Data":"20cec08b68acb488954f0cd2e2f3dc2aa597baeadf9aae28d7edda1d29d5c51a"} Nov 27 11:28:31 crc kubenswrapper[4807]: I1127 11:28:31.413654 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" event={"ID":"4862f897-1024-446a-bc1c-9b8c9c8e7792","Type":"ContainerStarted","Data":"0e1c32ac745999a6ac664e30ea3ac386f054dded4e56a31a6acb09e9eb001935"} Nov 27 11:28:31 crc kubenswrapper[4807]: I1127 11:28:31.419572 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mznhj" 
event={"ID":"9f6dc415-3063-48ab-8a84-27a041b110f4","Type":"ContainerStarted","Data":"d6188fb02897da9f75290e681f1df02212cf0a1b2f284fd917be65d34e698154"} Nov 27 11:28:31 crc kubenswrapper[4807]: I1127 11:28:31.419610 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mznhj" event={"ID":"9f6dc415-3063-48ab-8a84-27a041b110f4","Type":"ContainerStarted","Data":"37ed51d24f8c5e77bdf66a5a66ba99e9f87c7ba9df1071fe68f39813e2448502"} Nov 27 11:28:31 crc kubenswrapper[4807]: I1127 11:28:31.441782 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b38c64fe-fd37-446f-a0fc-e110a5904b22","Type":"ContainerStarted","Data":"7017aee7e12d0bae7e84f545ce509cd32112c0fc54420c8e7e63dca09c64ec62"} Nov 27 11:28:31 crc kubenswrapper[4807]: I1127 11:28:31.464155 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mrj5n" event={"ID":"d88895da-e0b3-40c7-82bf-eb68882e01cd","Type":"ContainerStarted","Data":"3a8f63156e2a97220998270496878d887c1bf32ef7b88dbac31e38b555e00e8c"} Nov 27 11:28:31 crc kubenswrapper[4807]: I1127 11:28:31.464193 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mrj5n" event={"ID":"d88895da-e0b3-40c7-82bf-eb68882e01cd","Type":"ContainerStarted","Data":"48122bfef2c354c2c3f50a77f3f7f57a360a0e41745157f496160debe6b4f8f3"} Nov 27 11:28:31 crc kubenswrapper[4807]: I1127 11:28:31.589575 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-mznhj" podStartSLOduration=1.589555375 podStartE2EDuration="1.589555375s" podCreationTimestamp="2025-11-27 11:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:28:31.503952927 +0000 UTC m=+1152.603451125" watchObservedRunningTime="2025-11-27 11:28:31.589555375 +0000 UTC m=+1152.689053563" Nov 27 
11:28:31 crc kubenswrapper[4807]: I1127 11:28:31.599026 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-mrj5n" podStartSLOduration=2.599009773 podStartE2EDuration="2.599009773s" podCreationTimestamp="2025-11-27 11:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:28:31.548665576 +0000 UTC m=+1152.648163774" watchObservedRunningTime="2025-11-27 11:28:31.599009773 +0000 UTC m=+1152.698507971" Nov 27 11:28:32 crc kubenswrapper[4807]: I1127 11:28:32.508024 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" event={"ID":"4862f897-1024-446a-bc1c-9b8c9c8e7792","Type":"ContainerStarted","Data":"e53072ecd3e796bc06b4588e2133c288edd5bec0d412c409df34c7db9aec7b44"} Nov 27 11:28:32 crc kubenswrapper[4807]: I1127 11:28:32.509492 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:32 crc kubenswrapper[4807]: I1127 11:28:32.538113 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" podStartSLOduration=3.538096794 podStartE2EDuration="3.538096794s" podCreationTimestamp="2025-11-27 11:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:28:32.530760872 +0000 UTC m=+1153.630259070" watchObservedRunningTime="2025-11-27 11:28:32.538096794 +0000 UTC m=+1153.637594992" Nov 27 11:28:33 crc kubenswrapper[4807]: I1127 11:28:33.713163 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:33 crc kubenswrapper[4807]: I1127 11:28:33.759842 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 11:28:35 crc kubenswrapper[4807]: I1127 11:28:35.535204 
4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b38c64fe-fd37-446f-a0fc-e110a5904b22" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://c3dda1dae71b581417991de5e983fe83e264e73be603d42e6e17c081f9942dad" gracePeriod=30 Nov 27 11:28:35 crc kubenswrapper[4807]: I1127 11:28:35.538890 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d55c3c6e-766e-4471-8c37-6603fb137822" containerName="nova-metadata-log" containerID="cri-o://3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c" gracePeriod=30 Nov 27 11:28:35 crc kubenswrapper[4807]: I1127 11:28:35.538938 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d55c3c6e-766e-4471-8c37-6603fb137822" containerName="nova-metadata-metadata" containerID="cri-o://4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451" gracePeriod=30 Nov 27 11:28:35 crc kubenswrapper[4807]: I1127 11:28:35.544976 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b38c64fe-fd37-446f-a0fc-e110a5904b22","Type":"ContainerStarted","Data":"c3dda1dae71b581417991de5e983fe83e264e73be603d42e6e17c081f9942dad"} Nov 27 11:28:35 crc kubenswrapper[4807]: I1127 11:28:35.545019 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d55c3c6e-766e-4471-8c37-6603fb137822","Type":"ContainerStarted","Data":"4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451"} Nov 27 11:28:35 crc kubenswrapper[4807]: I1127 11:28:35.545032 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d55c3c6e-766e-4471-8c37-6603fb137822","Type":"ContainerStarted","Data":"3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c"} Nov 27 11:28:35 crc kubenswrapper[4807]: I1127 11:28:35.545561 
4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a","Type":"ContainerStarted","Data":"16dd8945d8db24daad080962d2db480967a48873a14baf6c1057cd99029c38fc"} Nov 27 11:28:35 crc kubenswrapper[4807]: I1127 11:28:35.545584 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a","Type":"ContainerStarted","Data":"bc025f35f1b6de460e73b2b59f7cd8389f3bc608fea22757d856a900f8702283"} Nov 27 11:28:35 crc kubenswrapper[4807]: I1127 11:28:35.547328 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d3592034-67db-49ff-80ff-c3991a5dbaf7","Type":"ContainerStarted","Data":"a81a1e9bd997da7ef7e9ef645aceeea29381656939624d7907502d48170b4d58"} Nov 27 11:28:35 crc kubenswrapper[4807]: I1127 11:28:35.560154 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.873219443 podStartE2EDuration="6.56013838s" podCreationTimestamp="2025-11-27 11:28:29 +0000 UTC" firstStartedPulling="2025-11-27 11:28:30.584050409 +0000 UTC m=+1151.683548607" lastFinishedPulling="2025-11-27 11:28:34.270969346 +0000 UTC m=+1155.370467544" observedRunningTime="2025-11-27 11:28:35.557671216 +0000 UTC m=+1156.657169414" watchObservedRunningTime="2025-11-27 11:28:35.56013838 +0000 UTC m=+1156.659636578" Nov 27 11:28:35 crc kubenswrapper[4807]: I1127 11:28:35.578463 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.723636372 podStartE2EDuration="6.578446219s" podCreationTimestamp="2025-11-27 11:28:29 +0000 UTC" firstStartedPulling="2025-11-27 11:28:30.414545937 +0000 UTC m=+1151.514044135" lastFinishedPulling="2025-11-27 11:28:34.269355784 +0000 UTC m=+1155.368853982" observedRunningTime="2025-11-27 11:28:35.571749634 +0000 UTC m=+1156.671247832" 
watchObservedRunningTime="2025-11-27 11:28:35.578446219 +0000 UTC m=+1156.677944417" Nov 27 11:28:35 crc kubenswrapper[4807]: I1127 11:28:35.589215 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.022223527 podStartE2EDuration="6.58919644s" podCreationTimestamp="2025-11-27 11:28:29 +0000 UTC" firstStartedPulling="2025-11-27 11:28:30.707392612 +0000 UTC m=+1151.806890810" lastFinishedPulling="2025-11-27 11:28:34.274365525 +0000 UTC m=+1155.373863723" observedRunningTime="2025-11-27 11:28:35.588784779 +0000 UTC m=+1156.688282987" watchObservedRunningTime="2025-11-27 11:28:35.58919644 +0000 UTC m=+1156.688694638" Nov 27 11:28:35 crc kubenswrapper[4807]: I1127 11:28:35.615525 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.588590271 podStartE2EDuration="6.615508348s" podCreationTimestamp="2025-11-27 11:28:29 +0000 UTC" firstStartedPulling="2025-11-27 11:28:30.243496885 +0000 UTC m=+1151.342995083" lastFinishedPulling="2025-11-27 11:28:34.270414962 +0000 UTC m=+1155.369913160" observedRunningTime="2025-11-27 11:28:35.60680904 +0000 UTC m=+1156.706307248" watchObservedRunningTime="2025-11-27 11:28:35.615508348 +0000 UTC m=+1156.715006546" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.145722 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.301814 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jggdw\" (UniqueName: \"kubernetes.io/projected/d55c3c6e-766e-4471-8c37-6603fb137822-kube-api-access-jggdw\") pod \"d55c3c6e-766e-4471-8c37-6603fb137822\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.301868 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55c3c6e-766e-4471-8c37-6603fb137822-combined-ca-bundle\") pod \"d55c3c6e-766e-4471-8c37-6603fb137822\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.301998 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55c3c6e-766e-4471-8c37-6603fb137822-config-data\") pod \"d55c3c6e-766e-4471-8c37-6603fb137822\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.302151 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d55c3c6e-766e-4471-8c37-6603fb137822-logs\") pod \"d55c3c6e-766e-4471-8c37-6603fb137822\" (UID: \"d55c3c6e-766e-4471-8c37-6603fb137822\") " Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.303160 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d55c3c6e-766e-4471-8c37-6603fb137822-logs" (OuterVolumeSpecName: "logs") pod "d55c3c6e-766e-4471-8c37-6603fb137822" (UID: "d55c3c6e-766e-4471-8c37-6603fb137822"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.303631 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d55c3c6e-766e-4471-8c37-6603fb137822-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.307809 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d55c3c6e-766e-4471-8c37-6603fb137822-kube-api-access-jggdw" (OuterVolumeSpecName: "kube-api-access-jggdw") pod "d55c3c6e-766e-4471-8c37-6603fb137822" (UID: "d55c3c6e-766e-4471-8c37-6603fb137822"). InnerVolumeSpecName "kube-api-access-jggdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.329776 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55c3c6e-766e-4471-8c37-6603fb137822-config-data" (OuterVolumeSpecName: "config-data") pod "d55c3c6e-766e-4471-8c37-6603fb137822" (UID: "d55c3c6e-766e-4471-8c37-6603fb137822"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.331398 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d55c3c6e-766e-4471-8c37-6603fb137822-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d55c3c6e-766e-4471-8c37-6603fb137822" (UID: "d55c3c6e-766e-4471-8c37-6603fb137822"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.406789 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d55c3c6e-766e-4471-8c37-6603fb137822-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.406824 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jggdw\" (UniqueName: \"kubernetes.io/projected/d55c3c6e-766e-4471-8c37-6603fb137822-kube-api-access-jggdw\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.406835 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d55c3c6e-766e-4471-8c37-6603fb137822-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.558943 4807 generic.go:334] "Generic (PLEG): container finished" podID="d55c3c6e-766e-4471-8c37-6603fb137822" containerID="4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451" exitCode=0 Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.558975 4807 generic.go:334] "Generic (PLEG): container finished" podID="d55c3c6e-766e-4471-8c37-6603fb137822" containerID="3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c" exitCode=143 Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.559732 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.570350 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d55c3c6e-766e-4471-8c37-6603fb137822","Type":"ContainerDied","Data":"4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451"} Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.570445 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d55c3c6e-766e-4471-8c37-6603fb137822","Type":"ContainerDied","Data":"3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c"} Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.570461 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d55c3c6e-766e-4471-8c37-6603fb137822","Type":"ContainerDied","Data":"b7e4857469c7f6e4a3e28fa5d84f8ad8e2d73fb3bcb286611ac9332734d6e864"} Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.570484 4807 scope.go:117] "RemoveContainer" containerID="4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.600668 4807 scope.go:117] "RemoveContainer" containerID="3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.608707 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.627588 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.636589 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:36 crc kubenswrapper[4807]: E1127 11:28:36.636974 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55c3c6e-766e-4471-8c37-6603fb137822" containerName="nova-metadata-metadata" Nov 27 11:28:36 crc 
kubenswrapper[4807]: I1127 11:28:36.636991 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55c3c6e-766e-4471-8c37-6603fb137822" containerName="nova-metadata-metadata" Nov 27 11:28:36 crc kubenswrapper[4807]: E1127 11:28:36.637017 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d55c3c6e-766e-4471-8c37-6603fb137822" containerName="nova-metadata-log" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.637024 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d55c3c6e-766e-4471-8c37-6603fb137822" containerName="nova-metadata-log" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.637258 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d55c3c6e-766e-4471-8c37-6603fb137822" containerName="nova-metadata-log" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.637293 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d55c3c6e-766e-4471-8c37-6603fb137822" containerName="nova-metadata-metadata" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.637457 4807 scope.go:117] "RemoveContainer" containerID="4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451" Nov 27 11:28:36 crc kubenswrapper[4807]: E1127 11:28:36.638019 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451\": container with ID starting with 4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451 not found: ID does not exist" containerID="4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.638060 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451"} err="failed to get container status \"4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451\": rpc error: code = 
NotFound desc = could not find container \"4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451\": container with ID starting with 4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451 not found: ID does not exist" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.638085 4807 scope.go:117] "RemoveContainer" containerID="3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c" Nov 27 11:28:36 crc kubenswrapper[4807]: E1127 11:28:36.638390 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c\": container with ID starting with 3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c not found: ID does not exist" containerID="3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.638416 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c"} err="failed to get container status \"3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c\": rpc error: code = NotFound desc = could not find container \"3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c\": container with ID starting with 3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c not found: ID does not exist" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.638430 4807 scope.go:117] "RemoveContainer" containerID="4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.638579 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451"} err="failed to get container status \"4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451\": rpc 
error: code = NotFound desc = could not find container \"4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451\": container with ID starting with 4505d3d8820bd72002654482498c708643089eb57955c74c064bd6fed574b451 not found: ID does not exist" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.638596 4807 scope.go:117] "RemoveContainer" containerID="3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.638762 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c"} err="failed to get container status \"3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c\": rpc error: code = NotFound desc = could not find container \"3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c\": container with ID starting with 3dfaab8e43a742010f3062b6cfc291f9afde9fd291f51ac1f9ecae7fc125b43c not found: ID does not exist" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.639028 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.643169 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.643335 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.659182 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.710805 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-config-data\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.710855 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwxcc\" (UniqueName: \"kubernetes.io/projected/355b0a5f-b252-4df4-a977-10ea814fab65-kube-api-access-gwxcc\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.710967 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.711067 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.711098 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/355b0a5f-b252-4df4-a977-10ea814fab65-logs\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: E1127 11:28:36.742794 4807 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd55c3c6e_766e_4471_8c37_6603fb137822.slice\": RecentStats: unable to find data in memory cache]" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.813136 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwxcc\" (UniqueName: \"kubernetes.io/projected/355b0a5f-b252-4df4-a977-10ea814fab65-kube-api-access-gwxcc\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.813254 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.813328 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " 
pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.813362 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/355b0a5f-b252-4df4-a977-10ea814fab65-logs\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.813382 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-config-data\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.813860 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/355b0a5f-b252-4df4-a977-10ea814fab65-logs\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.817430 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.817889 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-config-data\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.823767 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.834899 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwxcc\" (UniqueName: \"kubernetes.io/projected/355b0a5f-b252-4df4-a977-10ea814fab65-kube-api-access-gwxcc\") pod \"nova-metadata-0\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " pod="openstack/nova-metadata-0" Nov 27 11:28:36 crc kubenswrapper[4807]: I1127 11:28:36.957867 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:28:37 crc kubenswrapper[4807]: I1127 11:28:37.451541 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:37 crc kubenswrapper[4807]: W1127 11:28:37.453875 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod355b0a5f_b252_4df4_a977_10ea814fab65.slice/crio-846e8c4d3940d6d9df6ea95c67879e4c1d8571632bdb6b1fdc9e2ab0cb974224 WatchSource:0}: Error finding container 846e8c4d3940d6d9df6ea95c67879e4c1d8571632bdb6b1fdc9e2ab0cb974224: Status 404 returned error can't find the container with id 846e8c4d3940d6d9df6ea95c67879e4c1d8571632bdb6b1fdc9e2ab0cb974224 Nov 27 11:28:37 crc kubenswrapper[4807]: I1127 11:28:37.549628 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d55c3c6e-766e-4471-8c37-6603fb137822" path="/var/lib/kubelet/pods/d55c3c6e-766e-4471-8c37-6603fb137822/volumes" Nov 27 11:28:37 crc kubenswrapper[4807]: I1127 11:28:37.578128 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"355b0a5f-b252-4df4-a977-10ea814fab65","Type":"ContainerStarted","Data":"846e8c4d3940d6d9df6ea95c67879e4c1d8571632bdb6b1fdc9e2ab0cb974224"} Nov 27 11:28:38 crc 
kubenswrapper[4807]: I1127 11:28:38.589997 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"355b0a5f-b252-4df4-a977-10ea814fab65","Type":"ContainerStarted","Data":"e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b"} Nov 27 11:28:38 crc kubenswrapper[4807]: I1127 11:28:38.590395 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"355b0a5f-b252-4df4-a977-10ea814fab65","Type":"ContainerStarted","Data":"bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6"} Nov 27 11:28:38 crc kubenswrapper[4807]: I1127 11:28:38.593589 4807 generic.go:334] "Generic (PLEG): container finished" podID="d88895da-e0b3-40c7-82bf-eb68882e01cd" containerID="3a8f63156e2a97220998270496878d887c1bf32ef7b88dbac31e38b555e00e8c" exitCode=0 Nov 27 11:28:38 crc kubenswrapper[4807]: I1127 11:28:38.593702 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mrj5n" event={"ID":"d88895da-e0b3-40c7-82bf-eb68882e01cd","Type":"ContainerDied","Data":"3a8f63156e2a97220998270496878d887c1bf32ef7b88dbac31e38b555e00e8c"} Nov 27 11:28:38 crc kubenswrapper[4807]: I1127 11:28:38.595651 4807 generic.go:334] "Generic (PLEG): container finished" podID="9f6dc415-3063-48ab-8a84-27a041b110f4" containerID="d6188fb02897da9f75290e681f1df02212cf0a1b2f284fd917be65d34e698154" exitCode=0 Nov 27 11:28:38 crc kubenswrapper[4807]: I1127 11:28:38.595698 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mznhj" event={"ID":"9f6dc415-3063-48ab-8a84-27a041b110f4","Type":"ContainerDied","Data":"d6188fb02897da9f75290e681f1df02212cf0a1b2f284fd917be65d34e698154"} Nov 27 11:28:38 crc kubenswrapper[4807]: I1127 11:28:38.610026 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.610010724 podStartE2EDuration="2.610010724s" podCreationTimestamp="2025-11-27 
11:28:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:28:38.608125135 +0000 UTC m=+1159.707623343" watchObservedRunningTime="2025-11-27 11:28:38.610010724 +0000 UTC m=+1159.709508922" Nov 27 11:28:39 crc kubenswrapper[4807]: I1127 11:28:39.751720 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 27 11:28:39 crc kubenswrapper[4807]: I1127 11:28:39.752002 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 27 11:28:39 crc kubenswrapper[4807]: I1127 11:28:39.963751 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.016485 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.016517 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.024565 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.130423 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gvl2d"] Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.130629 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" podUID="7af73ef1-bb7d-4575-bc4e-3cff7945f644" containerName="dnsmasq-dns" containerID="cri-o://30e532528e1d6da8394614fb252c223a54036e145f64e638e053c9ce706ce0b5" gracePeriod=10 Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.147822 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.308361 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.320837 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.398224 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-scripts\") pod \"d88895da-e0b3-40c7-82bf-eb68882e01cd\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.398311 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-config-data\") pod \"d88895da-e0b3-40c7-82bf-eb68882e01cd\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.398366 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpchd\" (UniqueName: \"kubernetes.io/projected/d88895da-e0b3-40c7-82bf-eb68882e01cd-kube-api-access-vpchd\") pod \"d88895da-e0b3-40c7-82bf-eb68882e01cd\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.398413 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-config-data\") pod \"9f6dc415-3063-48ab-8a84-27a041b110f4\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.398444 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-combined-ca-bundle\") pod \"9f6dc415-3063-48ab-8a84-27a041b110f4\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.398477 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flx89\" (UniqueName: \"kubernetes.io/projected/9f6dc415-3063-48ab-8a84-27a041b110f4-kube-api-access-flx89\") pod \"9f6dc415-3063-48ab-8a84-27a041b110f4\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.398582 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-scripts\") pod \"9f6dc415-3063-48ab-8a84-27a041b110f4\" (UID: \"9f6dc415-3063-48ab-8a84-27a041b110f4\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.398631 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-combined-ca-bundle\") pod \"d88895da-e0b3-40c7-82bf-eb68882e01cd\" (UID: \"d88895da-e0b3-40c7-82bf-eb68882e01cd\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.405003 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-scripts" (OuterVolumeSpecName: "scripts") pod "9f6dc415-3063-48ab-8a84-27a041b110f4" (UID: "9f6dc415-3063-48ab-8a84-27a041b110f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.405050 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-scripts" (OuterVolumeSpecName: "scripts") pod "d88895da-e0b3-40c7-82bf-eb68882e01cd" (UID: "d88895da-e0b3-40c7-82bf-eb68882e01cd"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.407213 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6dc415-3063-48ab-8a84-27a041b110f4-kube-api-access-flx89" (OuterVolumeSpecName: "kube-api-access-flx89") pod "9f6dc415-3063-48ab-8a84-27a041b110f4" (UID: "9f6dc415-3063-48ab-8a84-27a041b110f4"). InnerVolumeSpecName "kube-api-access-flx89". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.408130 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d88895da-e0b3-40c7-82bf-eb68882e01cd-kube-api-access-vpchd" (OuterVolumeSpecName: "kube-api-access-vpchd") pod "d88895da-e0b3-40c7-82bf-eb68882e01cd" (UID: "d88895da-e0b3-40c7-82bf-eb68882e01cd"). InnerVolumeSpecName "kube-api-access-vpchd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.434206 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d88895da-e0b3-40c7-82bf-eb68882e01cd" (UID: "d88895da-e0b3-40c7-82bf-eb68882e01cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.434585 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f6dc415-3063-48ab-8a84-27a041b110f4" (UID: "9f6dc415-3063-48ab-8a84-27a041b110f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.434923 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-config-data" (OuterVolumeSpecName: "config-data") pod "9f6dc415-3063-48ab-8a84-27a041b110f4" (UID: "9f6dc415-3063-48ab-8a84-27a041b110f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.447380 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-config-data" (OuterVolumeSpecName: "config-data") pod "d88895da-e0b3-40c7-82bf-eb68882e01cd" (UID: "d88895da-e0b3-40c7-82bf-eb68882e01cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.500720 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.501005 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.501018 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flx89\" (UniqueName: \"kubernetes.io/projected/9f6dc415-3063-48ab-8a84-27a041b110f4-kube-api-access-flx89\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.501026 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f6dc415-3063-48ab-8a84-27a041b110f4-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc 
kubenswrapper[4807]: I1127 11:28:40.501034 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.501043 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.501051 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d88895da-e0b3-40c7-82bf-eb68882e01cd-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.501060 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpchd\" (UniqueName: \"kubernetes.io/projected/d88895da-e0b3-40c7-82bf-eb68882e01cd-kube-api-access-vpchd\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.571425 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.620665 4807 generic.go:334] "Generic (PLEG): container finished" podID="7af73ef1-bb7d-4575-bc4e-3cff7945f644" containerID="30e532528e1d6da8394614fb252c223a54036e145f64e638e053c9ce706ce0b5" exitCode=0 Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.620762 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" event={"ID":"7af73ef1-bb7d-4575-bc4e-3cff7945f644","Type":"ContainerDied","Data":"30e532528e1d6da8394614fb252c223a54036e145f64e638e053c9ce706ce0b5"} Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.620820 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" event={"ID":"7af73ef1-bb7d-4575-bc4e-3cff7945f644","Type":"ContainerDied","Data":"ff8157db1781b5fcb60d1f5a5f7f1701ed6498f25257866abe985b6e0a2f4b1f"} Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.620841 4807 scope.go:117] "RemoveContainer" containerID="30e532528e1d6da8394614fb252c223a54036e145f64e638e053c9ce706ce0b5" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.621141 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.631451 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-mrj5n" event={"ID":"d88895da-e0b3-40c7-82bf-eb68882e01cd","Type":"ContainerDied","Data":"48122bfef2c354c2c3f50a77f3f7f57a360a0e41745157f496160debe6b4f8f3"} Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.631486 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48122bfef2c354c2c3f50a77f3f7f57a360a0e41745157f496160debe6b4f8f3" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.631553 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-mrj5n" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.639926 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mznhj" event={"ID":"9f6dc415-3063-48ab-8a84-27a041b110f4","Type":"ContainerDied","Data":"37ed51d24f8c5e77bdf66a5a66ba99e9f87c7ba9df1071fe68f39813e2448502"} Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.639982 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37ed51d24f8c5e77bdf66a5a66ba99e9f87c7ba9df1071fe68f39813e2448502" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.640298 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mznhj" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.661435 4807 scope.go:117] "RemoveContainer" containerID="2bdb66c974c85074f4ff2073d5f0c95e1b6455e79f00132e7742f84b5d3a4525" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.699290 4807 scope.go:117] "RemoveContainer" containerID="30e532528e1d6da8394614fb252c223a54036e145f64e638e053c9ce706ce0b5" Nov 27 11:28:40 crc kubenswrapper[4807]: E1127 11:28:40.699698 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30e532528e1d6da8394614fb252c223a54036e145f64e638e053c9ce706ce0b5\": container with ID starting with 30e532528e1d6da8394614fb252c223a54036e145f64e638e053c9ce706ce0b5 not found: ID does not exist" containerID="30e532528e1d6da8394614fb252c223a54036e145f64e638e053c9ce706ce0b5" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.699727 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30e532528e1d6da8394614fb252c223a54036e145f64e638e053c9ce706ce0b5"} err="failed to get container status \"30e532528e1d6da8394614fb252c223a54036e145f64e638e053c9ce706ce0b5\": rpc error: code = NotFound desc = 
could not find container \"30e532528e1d6da8394614fb252c223a54036e145f64e638e053c9ce706ce0b5\": container with ID starting with 30e532528e1d6da8394614fb252c223a54036e145f64e638e053c9ce706ce0b5 not found: ID does not exist" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.699747 4807 scope.go:117] "RemoveContainer" containerID="2bdb66c974c85074f4ff2073d5f0c95e1b6455e79f00132e7742f84b5d3a4525" Nov 27 11:28:40 crc kubenswrapper[4807]: E1127 11:28:40.699916 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bdb66c974c85074f4ff2073d5f0c95e1b6455e79f00132e7742f84b5d3a4525\": container with ID starting with 2bdb66c974c85074f4ff2073d5f0c95e1b6455e79f00132e7742f84b5d3a4525 not found: ID does not exist" containerID="2bdb66c974c85074f4ff2073d5f0c95e1b6455e79f00132e7742f84b5d3a4525" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.699940 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bdb66c974c85074f4ff2073d5f0c95e1b6455e79f00132e7742f84b5d3a4525"} err="failed to get container status \"2bdb66c974c85074f4ff2073d5f0c95e1b6455e79f00132e7742f84b5d3a4525\": rpc error: code = NotFound desc = could not find container \"2bdb66c974c85074f4ff2073d5f0c95e1b6455e79f00132e7742f84b5d3a4525\": container with ID starting with 2bdb66c974c85074f4ff2073d5f0c95e1b6455e79f00132e7742f84b5d3a4525 not found: ID does not exist" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.705675 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-dns-svc\") pod \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.705729 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-ovsdbserver-nb\") pod \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.705777 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-config\") pod \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.705800 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4ph8\" (UniqueName: \"kubernetes.io/projected/7af73ef1-bb7d-4575-bc4e-3cff7945f644-kube-api-access-g4ph8\") pod \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.705901 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-dns-swift-storage-0\") pod \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.706034 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-ovsdbserver-sb\") pod \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\" (UID: \"7af73ef1-bb7d-4575-bc4e-3cff7945f644\") " Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.713211 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.713277 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af73ef1-bb7d-4575-bc4e-3cff7945f644-kube-api-access-g4ph8" 
(OuterVolumeSpecName: "kube-api-access-g4ph8") pod "7af73ef1-bb7d-4575-bc4e-3cff7945f644" (UID: "7af73ef1-bb7d-4575-bc4e-3cff7945f644"). InnerVolumeSpecName "kube-api-access-g4ph8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.728411 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 27 11:28:40 crc kubenswrapper[4807]: E1127 11:28:40.728781 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af73ef1-bb7d-4575-bc4e-3cff7945f644" containerName="dnsmasq-dns" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.728797 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af73ef1-bb7d-4575-bc4e-3cff7945f644" containerName="dnsmasq-dns" Nov 27 11:28:40 crc kubenswrapper[4807]: E1127 11:28:40.728814 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6dc415-3063-48ab-8a84-27a041b110f4" containerName="nova-cell1-conductor-db-sync" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.728820 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6dc415-3063-48ab-8a84-27a041b110f4" containerName="nova-cell1-conductor-db-sync" Nov 27 11:28:40 crc kubenswrapper[4807]: E1127 11:28:40.728844 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af73ef1-bb7d-4575-bc4e-3cff7945f644" containerName="init" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.728849 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af73ef1-bb7d-4575-bc4e-3cff7945f644" containerName="init" Nov 27 11:28:40 crc kubenswrapper[4807]: E1127 11:28:40.728856 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d88895da-e0b3-40c7-82bf-eb68882e01cd" containerName="nova-manage" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.728862 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88895da-e0b3-40c7-82bf-eb68882e01cd" containerName="nova-manage" Nov 27 11:28:40 crc 
kubenswrapper[4807]: I1127 11:28:40.729051 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7af73ef1-bb7d-4575-bc4e-3cff7945f644" containerName="dnsmasq-dns" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.729123 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6dc415-3063-48ab-8a84-27a041b110f4" containerName="nova-cell1-conductor-db-sync" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.729140 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d88895da-e0b3-40c7-82bf-eb68882e01cd" containerName="nova-manage" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.730520 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.732473 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.743816 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.794614 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7af73ef1-bb7d-4575-bc4e-3cff7945f644" (UID: "7af73ef1-bb7d-4575-bc4e-3cff7945f644"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.797440 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7af73ef1-bb7d-4575-bc4e-3cff7945f644" (UID: "7af73ef1-bb7d-4575-bc4e-3cff7945f644"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.802479 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7af73ef1-bb7d-4575-bc4e-3cff7945f644" (UID: "7af73ef1-bb7d-4575-bc4e-3cff7945f644"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.808387 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f104b79-cd6b-4d1b-9ad9-a508e5ec636b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2f104b79-cd6b-4d1b-9ad9-a508e5ec636b\") " pod="openstack/nova-cell1-conductor-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.808458 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f104b79-cd6b-4d1b-9ad9-a508e5ec636b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2f104b79-cd6b-4d1b-9ad9-a508e5ec636b\") " pod="openstack/nova-cell1-conductor-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.808549 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjnwk\" (UniqueName: \"kubernetes.io/projected/2f104b79-cd6b-4d1b-9ad9-a508e5ec636b-kube-api-access-qjnwk\") pod \"nova-cell1-conductor-0\" (UID: \"2f104b79-cd6b-4d1b-9ad9-a508e5ec636b\") " pod="openstack/nova-cell1-conductor-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.808668 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 
11:28:40.808680 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.808692 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.808718 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4ph8\" (UniqueName: \"kubernetes.io/projected/7af73ef1-bb7d-4575-bc4e-3cff7945f644-kube-api-access-g4ph8\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.812666 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7af73ef1-bb7d-4575-bc4e-3cff7945f644" (UID: "7af73ef1-bb7d-4575-bc4e-3cff7945f644"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.820647 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-config" (OuterVolumeSpecName: "config") pod "7af73ef1-bb7d-4575-bc4e-3cff7945f644" (UID: "7af73ef1-bb7d-4575-bc4e-3cff7945f644"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.836722 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.838995 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.879703 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.879944 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" containerName="nova-api-log" containerID="cri-o://bc025f35f1b6de460e73b2b59f7cd8389f3bc608fea22757d856a900f8702283" gracePeriod=30 Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.880601 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" containerName="nova-api-api" containerID="cri-o://16dd8945d8db24daad080962d2db480967a48873a14baf6c1057cd99029c38fc" gracePeriod=30 Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.896461 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.897852 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="355b0a5f-b252-4df4-a977-10ea814fab65" 
containerName="nova-metadata-log" containerID="cri-o://bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6" gracePeriod=30 Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.897952 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="355b0a5f-b252-4df4-a977-10ea814fab65" containerName="nova-metadata-metadata" containerID="cri-o://e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b" gracePeriod=30 Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.910711 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjnwk\" (UniqueName: \"kubernetes.io/projected/2f104b79-cd6b-4d1b-9ad9-a508e5ec636b-kube-api-access-qjnwk\") pod \"nova-cell1-conductor-0\" (UID: \"2f104b79-cd6b-4d1b-9ad9-a508e5ec636b\") " pod="openstack/nova-cell1-conductor-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.910853 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f104b79-cd6b-4d1b-9ad9-a508e5ec636b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2f104b79-cd6b-4d1b-9ad9-a508e5ec636b\") " pod="openstack/nova-cell1-conductor-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.910956 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f104b79-cd6b-4d1b-9ad9-a508e5ec636b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2f104b79-cd6b-4d1b-9ad9-a508e5ec636b\") " pod="openstack/nova-cell1-conductor-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.911049 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.911069 4807 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7af73ef1-bb7d-4575-bc4e-3cff7945f644-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.916010 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f104b79-cd6b-4d1b-9ad9-a508e5ec636b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2f104b79-cd6b-4d1b-9ad9-a508e5ec636b\") " pod="openstack/nova-cell1-conductor-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.922919 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f104b79-cd6b-4d1b-9ad9-a508e5ec636b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2f104b79-cd6b-4d1b-9ad9-a508e5ec636b\") " pod="openstack/nova-cell1-conductor-0" Nov 27 11:28:40 crc kubenswrapper[4807]: I1127 11:28:40.944133 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjnwk\" (UniqueName: \"kubernetes.io/projected/2f104b79-cd6b-4d1b-9ad9-a508e5ec636b-kube-api-access-qjnwk\") pod \"nova-cell1-conductor-0\" (UID: \"2f104b79-cd6b-4d1b-9ad9-a508e5ec636b\") " pod="openstack/nova-cell1-conductor-0" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.074017 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gvl2d"] Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.085947 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.092831 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-gvl2d"] Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.250385 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.544841 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7af73ef1-bb7d-4575-bc4e-3cff7945f644" path="/var/lib/kubelet/pods/7af73ef1-bb7d-4575-bc4e-3cff7945f644/volumes" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.554556 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.632567 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/355b0a5f-b252-4df4-a977-10ea814fab65-logs\") pod \"355b0a5f-b252-4df4-a977-10ea814fab65\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.632648 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-nova-metadata-tls-certs\") pod \"355b0a5f-b252-4df4-a977-10ea814fab65\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.632710 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwxcc\" (UniqueName: \"kubernetes.io/projected/355b0a5f-b252-4df4-a977-10ea814fab65-kube-api-access-gwxcc\") pod \"355b0a5f-b252-4df4-a977-10ea814fab65\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.632809 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-config-data\") pod \"355b0a5f-b252-4df4-a977-10ea814fab65\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.632928 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-combined-ca-bundle\") pod \"355b0a5f-b252-4df4-a977-10ea814fab65\" (UID: \"355b0a5f-b252-4df4-a977-10ea814fab65\") " Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.634545 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/355b0a5f-b252-4df4-a977-10ea814fab65-logs" (OuterVolumeSpecName: "logs") pod "355b0a5f-b252-4df4-a977-10ea814fab65" (UID: "355b0a5f-b252-4df4-a977-10ea814fab65"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.641363 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/355b0a5f-b252-4df4-a977-10ea814fab65-kube-api-access-gwxcc" (OuterVolumeSpecName: "kube-api-access-gwxcc") pod "355b0a5f-b252-4df4-a977-10ea814fab65" (UID: "355b0a5f-b252-4df4-a977-10ea814fab65"). InnerVolumeSpecName "kube-api-access-gwxcc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.661610 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.662366 4807 generic.go:334] "Generic (PLEG): container finished" podID="355b0a5f-b252-4df4-a977-10ea814fab65" containerID="e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b" exitCode=0 Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.662388 4807 generic.go:334] "Generic (PLEG): container finished" podID="355b0a5f-b252-4df4-a977-10ea814fab65" containerID="bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6" exitCode=143 Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.662431 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"355b0a5f-b252-4df4-a977-10ea814fab65","Type":"ContainerDied","Data":"e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b"} Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.662432 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.662451 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"355b0a5f-b252-4df4-a977-10ea814fab65","Type":"ContainerDied","Data":"bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6"} Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.662462 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"355b0a5f-b252-4df4-a977-10ea814fab65","Type":"ContainerDied","Data":"846e8c4d3940d6d9df6ea95c67879e4c1d8571632bdb6b1fdc9e2ab0cb974224"} Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.662475 4807 scope.go:117] "RemoveContainer" containerID="e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.667185 4807 generic.go:334] "Generic (PLEG): container finished" podID="cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" containerID="bc025f35f1b6de460e73b2b59f7cd8389f3bc608fea22757d856a900f8702283" exitCode=143 Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.667578 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a","Type":"ContainerDied","Data":"bc025f35f1b6de460e73b2b59f7cd8389f3bc608fea22757d856a900f8702283"} Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.681410 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-config-data" (OuterVolumeSpecName: "config-data") pod "355b0a5f-b252-4df4-a977-10ea814fab65" (UID: "355b0a5f-b252-4df4-a977-10ea814fab65"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.696428 4807 scope.go:117] "RemoveContainer" containerID="bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.704333 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "355b0a5f-b252-4df4-a977-10ea814fab65" (UID: "355b0a5f-b252-4df4-a977-10ea814fab65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.705030 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "355b0a5f-b252-4df4-a977-10ea814fab65" (UID: "355b0a5f-b252-4df4-a977-10ea814fab65"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.735435 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.735461 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.735471 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/355b0a5f-b252-4df4-a977-10ea814fab65-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.735479 4807 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/355b0a5f-b252-4df4-a977-10ea814fab65-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.735489 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwxcc\" (UniqueName: \"kubernetes.io/projected/355b0a5f-b252-4df4-a977-10ea814fab65-kube-api-access-gwxcc\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.743198 4807 scope.go:117] "RemoveContainer" containerID="e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b" Nov 27 11:28:41 crc kubenswrapper[4807]: E1127 11:28:41.743608 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b\": container with ID starting with e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b not found: ID does not exist" 
containerID="e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.743658 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b"} err="failed to get container status \"e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b\": rpc error: code = NotFound desc = could not find container \"e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b\": container with ID starting with e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b not found: ID does not exist" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.743695 4807 scope.go:117] "RemoveContainer" containerID="bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6" Nov 27 11:28:41 crc kubenswrapper[4807]: E1127 11:28:41.743894 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6\": container with ID starting with bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6 not found: ID does not exist" containerID="bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.743915 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6"} err="failed to get container status \"bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6\": rpc error: code = NotFound desc = could not find container \"bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6\": container with ID starting with bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6 not found: ID does not exist" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.743944 4807 scope.go:117] 
"RemoveContainer" containerID="e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.744150 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b"} err="failed to get container status \"e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b\": rpc error: code = NotFound desc = could not find container \"e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b\": container with ID starting with e435d2d3c8a4d084cd1d9d5f78f1a702be4ddbf8cb1753627295197ffd1a0d2b not found: ID does not exist" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.744194 4807 scope.go:117] "RemoveContainer" containerID="bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6" Nov 27 11:28:41 crc kubenswrapper[4807]: I1127 11:28:41.744668 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6"} err="failed to get container status \"bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6\": rpc error: code = NotFound desc = could not find container \"bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6\": container with ID starting with bfc8c551209dda80e6b562f064b2d93a3c9fa8e1458c65061cde58e8cbaaa0e6 not found: ID does not exist" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.013090 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.031615 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.042572 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:42 crc kubenswrapper[4807]: E1127 11:28:42.043030 4807 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="355b0a5f-b252-4df4-a977-10ea814fab65" containerName="nova-metadata-metadata" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.043051 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="355b0a5f-b252-4df4-a977-10ea814fab65" containerName="nova-metadata-metadata" Nov 27 11:28:42 crc kubenswrapper[4807]: E1127 11:28:42.043078 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="355b0a5f-b252-4df4-a977-10ea814fab65" containerName="nova-metadata-log" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.043086 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="355b0a5f-b252-4df4-a977-10ea814fab65" containerName="nova-metadata-log" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.043372 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="355b0a5f-b252-4df4-a977-10ea814fab65" containerName="nova-metadata-metadata" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.043412 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="355b0a5f-b252-4df4-a977-10ea814fab65" containerName="nova-metadata-log" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.044447 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.046930 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.047068 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.050780 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.142710 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.142767 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.142926 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-config-data\") pod \"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.143088 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-logs\") pod \"nova-metadata-0\" (UID: 
\"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.143182 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52m4f\" (UniqueName: \"kubernetes.io/projected/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-kube-api-access-52m4f\") pod \"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.245078 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.245138 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.245195 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-config-data\") pod \"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.245263 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-logs\") pod \"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.245298 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52m4f\" (UniqueName: \"kubernetes.io/projected/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-kube-api-access-52m4f\") pod \"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.246460 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-logs\") pod \"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.249758 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.261731 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-config-data\") pod \"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.262882 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.264377 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52m4f\" (UniqueName: \"kubernetes.io/projected/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-kube-api-access-52m4f\") pod 
\"nova-metadata-0\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.383574 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.644678 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:28:42 crc kubenswrapper[4807]: W1127 11:28:42.650040 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80af84d5_da7b_4b8f_b7ab_1eef6e66e3e4.slice/crio-db3381d1c55ce71d196ab014af475dae54a4b36fcdc954cf0e1c76dc862f029e WatchSource:0}: Error finding container db3381d1c55ce71d196ab014af475dae54a4b36fcdc954cf0e1c76dc862f029e: Status 404 returned error can't find the container with id db3381d1c55ce71d196ab014af475dae54a4b36fcdc954cf0e1c76dc862f029e Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.700363 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4","Type":"ContainerStarted","Data":"db3381d1c55ce71d196ab014af475dae54a4b36fcdc954cf0e1c76dc862f029e"} Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.701951 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2f104b79-cd6b-4d1b-9ad9-a508e5ec636b","Type":"ContainerStarted","Data":"3e6121ad6f538420b618f3f235e73c1069bc7562e4c616bdd20fba3355cab9c4"} Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.702031 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2f104b79-cd6b-4d1b-9ad9-a508e5ec636b","Type":"ContainerStarted","Data":"2d8196ef72b19592b75917d4c1a7c7b1f09819aba5c356e8abd942ef812e03b6"} Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.702004 4807 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-scheduler-0" podUID="d3592034-67db-49ff-80ff-c3991a5dbaf7" containerName="nova-scheduler-scheduler" containerID="cri-o://a81a1e9bd997da7ef7e9ef645aceeea29381656939624d7907502d48170b4d58" gracePeriod=30 Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.702349 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 27 11:28:42 crc kubenswrapper[4807]: I1127 11:28:42.729157 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.729141752 podStartE2EDuration="2.729141752s" podCreationTimestamp="2025-11-27 11:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:28:42.725055676 +0000 UTC m=+1163.824553874" watchObservedRunningTime="2025-11-27 11:28:42.729141752 +0000 UTC m=+1163.828639951" Nov 27 11:28:43 crc kubenswrapper[4807]: I1127 11:28:43.542164 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="355b0a5f-b252-4df4-a977-10ea814fab65" path="/var/lib/kubelet/pods/355b0a5f-b252-4df4-a977-10ea814fab65/volumes" Nov 27 11:28:43 crc kubenswrapper[4807]: I1127 11:28:43.713670 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4","Type":"ContainerStarted","Data":"94d58272ad2d23bff2c454b1bfd3bf257a526c6978c66c8739fc968ef6cd51e4"} Nov 27 11:28:43 crc kubenswrapper[4807]: I1127 11:28:43.713946 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4","Type":"ContainerStarted","Data":"b66b568458143781102ecb88f209fc64366e2746f22ddf55ffe26193f2692e60"} Nov 27 11:28:43 crc kubenswrapper[4807]: I1127 11:28:43.745954 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-metadata-0" podStartSLOduration=1.745908485 podStartE2EDuration="1.745908485s" podCreationTimestamp="2025-11-27 11:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:28:43.744639202 +0000 UTC m=+1164.844137410" watchObservedRunningTime="2025-11-27 11:28:43.745908485 +0000 UTC m=+1164.845406683" Nov 27 11:28:45 crc kubenswrapper[4807]: E1127 11:28:45.018458 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a81a1e9bd997da7ef7e9ef645aceeea29381656939624d7907502d48170b4d58" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 27 11:28:45 crc kubenswrapper[4807]: E1127 11:28:45.020322 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a81a1e9bd997da7ef7e9ef645aceeea29381656939624d7907502d48170b4d58" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 27 11:28:45 crc kubenswrapper[4807]: E1127 11:28:45.021423 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a81a1e9bd997da7ef7e9ef645aceeea29381656939624d7907502d48170b4d58" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 27 11:28:45 crc kubenswrapper[4807]: E1127 11:28:45.021452 4807 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d3592034-67db-49ff-80ff-c3991a5dbaf7" containerName="nova-scheduler-scheduler" Nov 27 11:28:45 crc 
kubenswrapper[4807]: I1127 11:28:45.526489 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-gvl2d" podUID="7af73ef1-bb7d-4575-bc4e-3cff7945f644" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.161:5353: i/o timeout" Nov 27 11:28:46 crc kubenswrapper[4807]: I1127 11:28:46.128779 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 27 11:28:46 crc kubenswrapper[4807]: I1127 11:28:46.743722 4807 generic.go:334] "Generic (PLEG): container finished" podID="d3592034-67db-49ff-80ff-c3991a5dbaf7" containerID="a81a1e9bd997da7ef7e9ef645aceeea29381656939624d7907502d48170b4d58" exitCode=0 Nov 27 11:28:46 crc kubenswrapper[4807]: I1127 11:28:46.743764 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d3592034-67db-49ff-80ff-c3991a5dbaf7","Type":"ContainerDied","Data":"a81a1e9bd997da7ef7e9ef645aceeea29381656939624d7907502d48170b4d58"} Nov 27 11:28:46 crc kubenswrapper[4807]: I1127 11:28:46.874345 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.054922 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3592034-67db-49ff-80ff-c3991a5dbaf7-combined-ca-bundle\") pod \"d3592034-67db-49ff-80ff-c3991a5dbaf7\" (UID: \"d3592034-67db-49ff-80ff-c3991a5dbaf7\") " Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.055569 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3592034-67db-49ff-80ff-c3991a5dbaf7-config-data\") pod \"d3592034-67db-49ff-80ff-c3991a5dbaf7\" (UID: \"d3592034-67db-49ff-80ff-c3991a5dbaf7\") " Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.055851 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd8dd\" (UniqueName: \"kubernetes.io/projected/d3592034-67db-49ff-80ff-c3991a5dbaf7-kube-api-access-wd8dd\") pod \"d3592034-67db-49ff-80ff-c3991a5dbaf7\" (UID: \"d3592034-67db-49ff-80ff-c3991a5dbaf7\") " Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.100426 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3592034-67db-49ff-80ff-c3991a5dbaf7-kube-api-access-wd8dd" (OuterVolumeSpecName: "kube-api-access-wd8dd") pod "d3592034-67db-49ff-80ff-c3991a5dbaf7" (UID: "d3592034-67db-49ff-80ff-c3991a5dbaf7"). InnerVolumeSpecName "kube-api-access-wd8dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.119274 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3592034-67db-49ff-80ff-c3991a5dbaf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3592034-67db-49ff-80ff-c3991a5dbaf7" (UID: "d3592034-67db-49ff-80ff-c3991a5dbaf7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.133723 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3592034-67db-49ff-80ff-c3991a5dbaf7-config-data" (OuterVolumeSpecName: "config-data") pod "d3592034-67db-49ff-80ff-c3991a5dbaf7" (UID: "d3592034-67db-49ff-80ff-c3991a5dbaf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.158051 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3592034-67db-49ff-80ff-c3991a5dbaf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.158081 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3592034-67db-49ff-80ff-c3991a5dbaf7-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.158091 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd8dd\" (UniqueName: \"kubernetes.io/projected/d3592034-67db-49ff-80ff-c3991a5dbaf7-kube-api-access-wd8dd\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.384589 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.385116 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.725554 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.758646 4807 generic.go:334] "Generic (PLEG): container finished" podID="cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" containerID="16dd8945d8db24daad080962d2db480967a48873a14baf6c1057cd99029c38fc" exitCode=0 Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.758751 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a","Type":"ContainerDied","Data":"16dd8945d8db24daad080962d2db480967a48873a14baf6c1057cd99029c38fc"} Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.758778 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a","Type":"ContainerDied","Data":"14bf94e0865a4848cfbeaa376a1aa2df6b17af3a1fdd24f735210f0f24e9ff00"} Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.758797 4807 scope.go:117] "RemoveContainer" containerID="16dd8945d8db24daad080962d2db480967a48873a14baf6c1057cd99029c38fc" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.759022 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.776477 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.776616 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d3592034-67db-49ff-80ff-c3991a5dbaf7","Type":"ContainerDied","Data":"2f88a0a892065e96a1c9b00db127ba2a4b3a14a3e386f54c02e420e0018b431d"} Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.799237 4807 scope.go:117] "RemoveContainer" containerID="bc025f35f1b6de460e73b2b59f7cd8389f3bc608fea22757d856a900f8702283" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.846069 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.855149 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.857814 4807 scope.go:117] "RemoveContainer" containerID="16dd8945d8db24daad080962d2db480967a48873a14baf6c1057cd99029c38fc" Nov 27 11:28:47 crc kubenswrapper[4807]: E1127 11:28:47.859370 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16dd8945d8db24daad080962d2db480967a48873a14baf6c1057cd99029c38fc\": container with ID starting with 16dd8945d8db24daad080962d2db480967a48873a14baf6c1057cd99029c38fc not found: ID does not exist" containerID="16dd8945d8db24daad080962d2db480967a48873a14baf6c1057cd99029c38fc" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.859415 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16dd8945d8db24daad080962d2db480967a48873a14baf6c1057cd99029c38fc"} err="failed to get container status \"16dd8945d8db24daad080962d2db480967a48873a14baf6c1057cd99029c38fc\": rpc error: code = NotFound desc = could not find container \"16dd8945d8db24daad080962d2db480967a48873a14baf6c1057cd99029c38fc\": container with ID starting with 
16dd8945d8db24daad080962d2db480967a48873a14baf6c1057cd99029c38fc not found: ID does not exist" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.859447 4807 scope.go:117] "RemoveContainer" containerID="bc025f35f1b6de460e73b2b59f7cd8389f3bc608fea22757d856a900f8702283" Nov 27 11:28:47 crc kubenswrapper[4807]: E1127 11:28:47.859687 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc025f35f1b6de460e73b2b59f7cd8389f3bc608fea22757d856a900f8702283\": container with ID starting with bc025f35f1b6de460e73b2b59f7cd8389f3bc608fea22757d856a900f8702283 not found: ID does not exist" containerID="bc025f35f1b6de460e73b2b59f7cd8389f3bc608fea22757d856a900f8702283" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.859725 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc025f35f1b6de460e73b2b59f7cd8389f3bc608fea22757d856a900f8702283"} err="failed to get container status \"bc025f35f1b6de460e73b2b59f7cd8389f3bc608fea22757d856a900f8702283\": rpc error: code = NotFound desc = could not find container \"bc025f35f1b6de460e73b2b59f7cd8389f3bc608fea22757d856a900f8702283\": container with ID starting with bc025f35f1b6de460e73b2b59f7cd8389f3bc608fea22757d856a900f8702283 not found: ID does not exist" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.859745 4807 scope.go:117] "RemoveContainer" containerID="a81a1e9bd997da7ef7e9ef645aceeea29381656939624d7907502d48170b4d58" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.866737 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:28:47 crc kubenswrapper[4807]: E1127 11:28:47.867162 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" containerName="nova-api-api" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.867173 4807 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" containerName="nova-api-api" Nov 27 11:28:47 crc kubenswrapper[4807]: E1127 11:28:47.867192 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" containerName="nova-api-log" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.867198 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" containerName="nova-api-log" Nov 27 11:28:47 crc kubenswrapper[4807]: E1127 11:28:47.867209 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3592034-67db-49ff-80ff-c3991a5dbaf7" containerName="nova-scheduler-scheduler" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.867215 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3592034-67db-49ff-80ff-c3991a5dbaf7" containerName="nova-scheduler-scheduler" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.867406 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3592034-67db-49ff-80ff-c3991a5dbaf7" containerName="nova-scheduler-scheduler" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.867417 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" containerName="nova-api-log" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.867436 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" containerName="nova-api-api" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.868041 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.869636 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.885590 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.890019 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-logs\") pod \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.890094 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-config-data\") pod \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.890285 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cg88\" (UniqueName: \"kubernetes.io/projected/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-kube-api-access-9cg88\") pod \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.890331 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-combined-ca-bundle\") pod \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\" (UID: \"cb91123e-7df5-46ff-bd9c-68eb9cd8f76a\") " Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.890767 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-logs" 
(OuterVolumeSpecName: "logs") pod "cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" (UID: "cb91123e-7df5-46ff-bd9c-68eb9cd8f76a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.894473 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-kube-api-access-9cg88" (OuterVolumeSpecName: "kube-api-access-9cg88") pod "cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" (UID: "cb91123e-7df5-46ff-bd9c-68eb9cd8f76a"). InnerVolumeSpecName "kube-api-access-9cg88". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.917860 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-config-data" (OuterVolumeSpecName: "config-data") pod "cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" (UID: "cb91123e-7df5-46ff-bd9c-68eb9cd8f76a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.927117 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" (UID: "cb91123e-7df5-46ff-bd9c-68eb9cd8f76a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.992619 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l84z\" (UniqueName: \"kubernetes.io/projected/d8515dce-7827-4661-ad4b-ca66a948eeeb-kube-api-access-5l84z\") pod \"nova-scheduler-0\" (UID: \"d8515dce-7827-4661-ad4b-ca66a948eeeb\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.992694 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8515dce-7827-4661-ad4b-ca66a948eeeb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d8515dce-7827-4661-ad4b-ca66a948eeeb\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.992801 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8515dce-7827-4661-ad4b-ca66a948eeeb-config-data\") pod \"nova-scheduler-0\" (UID: \"d8515dce-7827-4661-ad4b-ca66a948eeeb\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.992859 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cg88\" (UniqueName: \"kubernetes.io/projected/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-kube-api-access-9cg88\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.992870 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.992879 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-logs\") on node 
\"crc\" DevicePath \"\"" Nov 27 11:28:47 crc kubenswrapper[4807]: I1127 11:28:47.992888 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.094432 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8515dce-7827-4661-ad4b-ca66a948eeeb-config-data\") pod \"nova-scheduler-0\" (UID: \"d8515dce-7827-4661-ad4b-ca66a948eeeb\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.094499 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l84z\" (UniqueName: \"kubernetes.io/projected/d8515dce-7827-4661-ad4b-ca66a948eeeb-kube-api-access-5l84z\") pod \"nova-scheduler-0\" (UID: \"d8515dce-7827-4661-ad4b-ca66a948eeeb\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.094546 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8515dce-7827-4661-ad4b-ca66a948eeeb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d8515dce-7827-4661-ad4b-ca66a948eeeb\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.094716 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.098976 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8515dce-7827-4661-ad4b-ca66a948eeeb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d8515dce-7827-4661-ad4b-ca66a948eeeb\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.099691 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8515dce-7827-4661-ad4b-ca66a948eeeb-config-data\") pod \"nova-scheduler-0\" (UID: \"d8515dce-7827-4661-ad4b-ca66a948eeeb\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.117002 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l84z\" (UniqueName: \"kubernetes.io/projected/d8515dce-7827-4661-ad4b-ca66a948eeeb-kube-api-access-5l84z\") pod \"nova-scheduler-0\" (UID: \"d8515dce-7827-4661-ad4b-ca66a948eeeb\") " pod="openstack/nova-scheduler-0" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.117064 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.126134 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.127941 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.129653 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.136009 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.191983 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.297198 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkngz\" (UniqueName: \"kubernetes.io/projected/65c54290-ae69-4c00-8968-3aceb71768c2-kube-api-access-lkngz\") pod \"nova-api-0\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " pod="openstack/nova-api-0" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.297571 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c54290-ae69-4c00-8968-3aceb71768c2-config-data\") pod \"nova-api-0\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " pod="openstack/nova-api-0" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.297604 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c54290-ae69-4c00-8968-3aceb71768c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " pod="openstack/nova-api-0" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.297682 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65c54290-ae69-4c00-8968-3aceb71768c2-logs\") pod \"nova-api-0\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " pod="openstack/nova-api-0" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.399730 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c54290-ae69-4c00-8968-3aceb71768c2-config-data\") pod \"nova-api-0\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " pod="openstack/nova-api-0" Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.399802 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c54290-ae69-4c00-8968-3aceb71768c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " pod="openstack/nova-api-0"
Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.399920 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65c54290-ae69-4c00-8968-3aceb71768c2-logs\") pod \"nova-api-0\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " pod="openstack/nova-api-0"
Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.399980 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkngz\" (UniqueName: \"kubernetes.io/projected/65c54290-ae69-4c00-8968-3aceb71768c2-kube-api-access-lkngz\") pod \"nova-api-0\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " pod="openstack/nova-api-0"
Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.400664 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65c54290-ae69-4c00-8968-3aceb71768c2-logs\") pod \"nova-api-0\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " pod="openstack/nova-api-0"
Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.408310 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c54290-ae69-4c00-8968-3aceb71768c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " pod="openstack/nova-api-0"
Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.408443 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c54290-ae69-4c00-8968-3aceb71768c2-config-data\") pod \"nova-api-0\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " pod="openstack/nova-api-0"
Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.423869 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkngz\" (UniqueName: \"kubernetes.io/projected/65c54290-ae69-4c00-8968-3aceb71768c2-kube-api-access-lkngz\") pod \"nova-api-0\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " pod="openstack/nova-api-0"
Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.444876 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.609374 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 27 11:28:48 crc kubenswrapper[4807]: W1127 11:28:48.610423 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8515dce_7827_4661_ad4b_ca66a948eeeb.slice/crio-fdc232336fb2c2f29bfb981c97aa8c0e86c4086a0140b688b6c51304bfb269df WatchSource:0}: Error finding container fdc232336fb2c2f29bfb981c97aa8c0e86c4086a0140b688b6c51304bfb269df: Status 404 returned error can't find the container with id fdc232336fb2c2f29bfb981c97aa8c0e86c4086a0140b688b6c51304bfb269df
Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.786748 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d8515dce-7827-4661-ad4b-ca66a948eeeb","Type":"ContainerStarted","Data":"ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13"}
Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.786798 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d8515dce-7827-4661-ad4b-ca66a948eeeb","Type":"ContainerStarted","Data":"fdc232336fb2c2f29bfb981c97aa8c0e86c4086a0140b688b6c51304bfb269df"}
Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.804619 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.804602838 podStartE2EDuration="1.804602838s" podCreationTimestamp="2025-11-27 11:28:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:28:48.801847116 +0000 UTC m=+1169.901345324" watchObservedRunningTime="2025-11-27 11:28:48.804602838 +0000 UTC m=+1169.904101036"
Nov 27 11:28:48 crc kubenswrapper[4807]: I1127 11:28:48.857631 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 27 11:28:48 crc kubenswrapper[4807]: W1127 11:28:48.859670 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65c54290_ae69_4c00_8968_3aceb71768c2.slice/crio-3777ef085d75a6c5bd1cbe578fea69e4f86df4fb3664af97e61b3266d13ab0e4 WatchSource:0}: Error finding container 3777ef085d75a6c5bd1cbe578fea69e4f86df4fb3664af97e61b3266d13ab0e4: Status 404 returned error can't find the container with id 3777ef085d75a6c5bd1cbe578fea69e4f86df4fb3664af97e61b3266d13ab0e4
Nov 27 11:28:49 crc kubenswrapper[4807]: I1127 11:28:49.541782 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb91123e-7df5-46ff-bd9c-68eb9cd8f76a" path="/var/lib/kubelet/pods/cb91123e-7df5-46ff-bd9c-68eb9cd8f76a/volumes"
Nov 27 11:28:49 crc kubenswrapper[4807]: I1127 11:28:49.542646 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3592034-67db-49ff-80ff-c3991a5dbaf7" path="/var/lib/kubelet/pods/d3592034-67db-49ff-80ff-c3991a5dbaf7/volumes"
Nov 27 11:28:49 crc kubenswrapper[4807]: I1127 11:28:49.800991 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65c54290-ae69-4c00-8968-3aceb71768c2","Type":"ContainerStarted","Data":"6f52bd38850acf7dda375ad250259c439cb863ba86660f24aa0ec05382305ec7"}
Nov 27 11:28:49 crc kubenswrapper[4807]: I1127 11:28:49.801503 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65c54290-ae69-4c00-8968-3aceb71768c2","Type":"ContainerStarted","Data":"9e788c066c7f2c23ffab074ade8a74c0c230f6c5fa4798f022d0911373e76d6e"}
Nov 27 11:28:49 crc kubenswrapper[4807]: I1127 11:28:49.801536 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65c54290-ae69-4c00-8968-3aceb71768c2","Type":"ContainerStarted","Data":"3777ef085d75a6c5bd1cbe578fea69e4f86df4fb3664af97e61b3266d13ab0e4"}
Nov 27 11:28:49 crc kubenswrapper[4807]: I1127 11:28:49.823605 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.823587418 podStartE2EDuration="1.823587418s" podCreationTimestamp="2025-11-27 11:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:28:49.821042551 +0000 UTC m=+1170.920540769" watchObservedRunningTime="2025-11-27 11:28:49.823587418 +0000 UTC m=+1170.923085616"
Nov 27 11:28:52 crc kubenswrapper[4807]: I1127 11:28:52.397432 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 27 11:28:52 crc kubenswrapper[4807]: I1127 11:28:52.397771 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Nov 27 11:28:53 crc kubenswrapper[4807]: I1127 11:28:53.192374 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Nov 27 11:28:53 crc kubenswrapper[4807]: I1127 11:28:53.415533 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 27 11:28:53 crc kubenswrapper[4807]: I1127 11:28:53.415536 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Nov 27 11:28:55 crc kubenswrapper[4807]: I1127 11:28:55.022730 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Nov 27 11:28:58 crc kubenswrapper[4807]: I1127 11:28:58.192433 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Nov 27 11:28:58 crc kubenswrapper[4807]: I1127 11:28:58.219368 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Nov 27 11:28:58 crc kubenswrapper[4807]: I1127 11:28:58.445446 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 27 11:28:58 crc kubenswrapper[4807]: I1127 11:28:58.446033 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Nov 27 11:28:58 crc kubenswrapper[4807]: I1127 11:28:58.674893 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 27 11:28:58 crc kubenswrapper[4807]: I1127 11:28:58.675153 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8cfd9070-b1bd-4a24-b694-ae4ff059ec1c" containerName="kube-state-metrics" containerID="cri-o://0e65cdd448122001009f7cd9fc198fd59b27840371789a1c62bddb5984f32655" gracePeriod=30
Nov 27 11:28:58 crc kubenswrapper[4807]: I1127 11:28:58.927729 4807 generic.go:334] "Generic (PLEG): container finished" podID="8cfd9070-b1bd-4a24-b694-ae4ff059ec1c" containerID="0e65cdd448122001009f7cd9fc198fd59b27840371789a1c62bddb5984f32655" exitCode=2
Nov 27 11:28:58 crc kubenswrapper[4807]: I1127 11:28:58.929640 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8cfd9070-b1bd-4a24-b694-ae4ff059ec1c","Type":"ContainerDied","Data":"0e65cdd448122001009f7cd9fc198fd59b27840371789a1c62bddb5984f32655"}
Nov 27 11:28:58 crc kubenswrapper[4807]: I1127 11:28:58.971158 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.197810 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.332106 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcr4j\" (UniqueName: \"kubernetes.io/projected/8cfd9070-b1bd-4a24-b694-ae4ff059ec1c-kube-api-access-pcr4j\") pod \"8cfd9070-b1bd-4a24-b694-ae4ff059ec1c\" (UID: \"8cfd9070-b1bd-4a24-b694-ae4ff059ec1c\") "
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.363473 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cfd9070-b1bd-4a24-b694-ae4ff059ec1c-kube-api-access-pcr4j" (OuterVolumeSpecName: "kube-api-access-pcr4j") pod "8cfd9070-b1bd-4a24-b694-ae4ff059ec1c" (UID: "8cfd9070-b1bd-4a24-b694-ae4ff059ec1c"). InnerVolumeSpecName "kube-api-access-pcr4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.435904 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcr4j\" (UniqueName: \"kubernetes.io/projected/8cfd9070-b1bd-4a24-b694-ae4ff059ec1c-kube-api-access-pcr4j\") on node \"crc\" DevicePath \"\""
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.532434 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="65c54290-ae69-4c00-8968-3aceb71768c2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.532506 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="65c54290-ae69-4c00-8968-3aceb71768c2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.940606 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8cfd9070-b1bd-4a24-b694-ae4ff059ec1c","Type":"ContainerDied","Data":"d9b7eb50fc8002c0dd8c6db3eb15b084899f99b7a36986ec1b09c8a004727220"}
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.940670 4807 scope.go:117] "RemoveContainer" containerID="0e65cdd448122001009f7cd9fc198fd59b27840371789a1c62bddb5984f32655"
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.940924 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.966853 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.986313 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.994740 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 27 11:28:59 crc kubenswrapper[4807]: E1127 11:28:59.995167 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cfd9070-b1bd-4a24-b694-ae4ff059ec1c" containerName="kube-state-metrics"
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.995186 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cfd9070-b1bd-4a24-b694-ae4ff059ec1c" containerName="kube-state-metrics"
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.995386 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cfd9070-b1bd-4a24-b694-ae4ff059ec1c" containerName="kube-state-metrics"
Nov 27 11:28:59 crc kubenswrapper[4807]: I1127 11:28:59.996112 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.002925 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.003829 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.004683 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.149025 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2744cb30-46c9-4f1e-a771-9bd30eefa50d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2744cb30-46c9-4f1e-a771-9bd30eefa50d\") " pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.149074 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2744cb30-46c9-4f1e-a771-9bd30eefa50d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2744cb30-46c9-4f1e-a771-9bd30eefa50d\") " pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.149112 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2744cb30-46c9-4f1e-a771-9bd30eefa50d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2744cb30-46c9-4f1e-a771-9bd30eefa50d\") " pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.149195 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b7xf\" (UniqueName: \"kubernetes.io/projected/2744cb30-46c9-4f1e-a771-9bd30eefa50d-kube-api-access-9b7xf\") pod \"kube-state-metrics-0\" (UID: \"2744cb30-46c9-4f1e-a771-9bd30eefa50d\") " pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.251041 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2744cb30-46c9-4f1e-a771-9bd30eefa50d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2744cb30-46c9-4f1e-a771-9bd30eefa50d\") " pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.252371 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2744cb30-46c9-4f1e-a771-9bd30eefa50d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2744cb30-46c9-4f1e-a771-9bd30eefa50d\") " pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.252522 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2744cb30-46c9-4f1e-a771-9bd30eefa50d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2744cb30-46c9-4f1e-a771-9bd30eefa50d\") " pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.252668 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b7xf\" (UniqueName: \"kubernetes.io/projected/2744cb30-46c9-4f1e-a771-9bd30eefa50d-kube-api-access-9b7xf\") pod \"kube-state-metrics-0\" (UID: \"2744cb30-46c9-4f1e-a771-9bd30eefa50d\") " pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.257411 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2744cb30-46c9-4f1e-a771-9bd30eefa50d-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2744cb30-46c9-4f1e-a771-9bd30eefa50d\") " pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.257562 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2744cb30-46c9-4f1e-a771-9bd30eefa50d-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2744cb30-46c9-4f1e-a771-9bd30eefa50d\") " pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.258608 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2744cb30-46c9-4f1e-a771-9bd30eefa50d-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2744cb30-46c9-4f1e-a771-9bd30eefa50d\") " pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.282443 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b7xf\" (UniqueName: \"kubernetes.io/projected/2744cb30-46c9-4f1e-a771-9bd30eefa50d-kube-api-access-9b7xf\") pod \"kube-state-metrics-0\" (UID: \"2744cb30-46c9-4f1e-a771-9bd30eefa50d\") " pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.316754 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.859420 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.964058 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2744cb30-46c9-4f1e-a771-9bd30eefa50d","Type":"ContainerStarted","Data":"2aa4be8dd850941df2d1bd5c63bcb90a8c6ae3aa69e76da908d116f1b9804c4a"}
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.977461 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.977806 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="ceilometer-central-agent" containerID="cri-o://b6531669e70229606721c9e5aa00518833f15780b6d6e474d44365fbf4b0876c" gracePeriod=30
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.977840 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="sg-core" containerID="cri-o://c9bea6942c23d826e34a12795c49bd4202c453fb4949bd2a1bf0dd93d00e3a07" gracePeriod=30
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.977901 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="proxy-httpd" containerID="cri-o://d5fea5a01bd51b28f808cc30c4f51471c5a893e8bc323836de2b0b5da9a01bd2" gracePeriod=30
Nov 27 11:29:00 crc kubenswrapper[4807]: I1127 11:29:00.977913 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="ceilometer-notification-agent" containerID="cri-o://8a23a8ee5dd1f6a8806df277e0705a464bae9430241b61c6068100a71e54cb24" gracePeriod=30
Nov 27 11:29:01 crc kubenswrapper[4807]: I1127 11:29:01.545369 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cfd9070-b1bd-4a24-b694-ae4ff059ec1c" path="/var/lib/kubelet/pods/8cfd9070-b1bd-4a24-b694-ae4ff059ec1c/volumes"
Nov 27 11:29:01 crc kubenswrapper[4807]: I1127 11:29:01.973438 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2744cb30-46c9-4f1e-a771-9bd30eefa50d","Type":"ContainerStarted","Data":"1d4390d7d118e0e82e1564400710091995079cc68606fee683d6032cafe6a5b9"}
Nov 27 11:29:01 crc kubenswrapper[4807]: I1127 11:29:01.973507 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Nov 27 11:29:01 crc kubenswrapper[4807]: I1127 11:29:01.976346 4807 generic.go:334] "Generic (PLEG): container finished" podID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerID="d5fea5a01bd51b28f808cc30c4f51471c5a893e8bc323836de2b0b5da9a01bd2" exitCode=0
Nov 27 11:29:01 crc kubenswrapper[4807]: I1127 11:29:01.976383 4807 generic.go:334] "Generic (PLEG): container finished" podID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerID="c9bea6942c23d826e34a12795c49bd4202c453fb4949bd2a1bf0dd93d00e3a07" exitCode=2
Nov 27 11:29:01 crc kubenswrapper[4807]: I1127 11:29:01.976396 4807 generic.go:334] "Generic (PLEG): container finished" podID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerID="b6531669e70229606721c9e5aa00518833f15780b6d6e474d44365fbf4b0876c" exitCode=0
Nov 27 11:29:01 crc kubenswrapper[4807]: I1127 11:29:01.976416 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491","Type":"ContainerDied","Data":"d5fea5a01bd51b28f808cc30c4f51471c5a893e8bc323836de2b0b5da9a01bd2"}
Nov 27 11:29:01 crc kubenswrapper[4807]: I1127 11:29:01.976443 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491","Type":"ContainerDied","Data":"c9bea6942c23d826e34a12795c49bd4202c453fb4949bd2a1bf0dd93d00e3a07"}
Nov 27 11:29:01 crc kubenswrapper[4807]: I1127 11:29:01.976453 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491","Type":"ContainerDied","Data":"b6531669e70229606721c9e5aa00518833f15780b6d6e474d44365fbf4b0876c"}
Nov 27 11:29:01 crc kubenswrapper[4807]: I1127 11:29:01.990831 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.6370699330000003 podStartE2EDuration="2.990810461s" podCreationTimestamp="2025-11-27 11:28:59 +0000 UTC" firstStartedPulling="2025-11-27 11:29:00.866131778 +0000 UTC m=+1181.965629976" lastFinishedPulling="2025-11-27 11:29:01.219872306 +0000 UTC m=+1182.319370504" observedRunningTime="2025-11-27 11:29:01.985128533 +0000 UTC m=+1183.084626741" watchObservedRunningTime="2025-11-27 11:29:01.990810461 +0000 UTC m=+1183.090308669"
Nov 27 11:29:02 crc kubenswrapper[4807]: I1127 11:29:02.391195 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 27 11:29:02 crc kubenswrapper[4807]: I1127 11:29:02.391363 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 27 11:29:02 crc kubenswrapper[4807]: I1127 11:29:02.397659 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 27 11:29:02 crc kubenswrapper[4807]: I1127 11:29:02.399016 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 27 11:29:04 crc kubenswrapper[4807]: I1127 11:29:04.878872 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 27 11:29:04 crc kubenswrapper[4807]: I1127 11:29:04.987306 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-run-httpd\") pod \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") "
Nov 27 11:29:04 crc kubenswrapper[4807]: I1127 11:29:04.987429 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b4h4\" (UniqueName: \"kubernetes.io/projected/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-kube-api-access-8b4h4\") pod \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") "
Nov 27 11:29:04 crc kubenswrapper[4807]: I1127 11:29:04.987450 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-scripts\") pod \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") "
Nov 27 11:29:04 crc kubenswrapper[4807]: I1127 11:29:04.987471 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-sg-core-conf-yaml\") pod \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") "
Nov 27 11:29:04 crc kubenswrapper[4807]: I1127 11:29:04.987516 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-combined-ca-bundle\") pod \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") "
Nov 27 11:29:04 crc kubenswrapper[4807]: I1127 11:29:04.987606 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-log-httpd\") pod \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") "
Nov 27 11:29:04 crc kubenswrapper[4807]: I1127 11:29:04.987656 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-config-data\") pod \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\" (UID: \"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491\") "
Nov 27 11:29:04 crc kubenswrapper[4807]: I1127 11:29:04.987900 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" (UID: "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 11:29:04 crc kubenswrapper[4807]: I1127 11:29:04.988266 4807 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 27 11:29:04 crc kubenswrapper[4807]: I1127 11:29:04.988967 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" (UID: "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 11:29:04 crc kubenswrapper[4807]: I1127 11:29:04.993723 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-scripts" (OuterVolumeSpecName: "scripts") pod "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" (UID: "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 11:29:04 crc kubenswrapper[4807]: I1127 11:29:04.993856 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-kube-api-access-8b4h4" (OuterVolumeSpecName: "kube-api-access-8b4h4") pod "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" (UID: "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491"). InnerVolumeSpecName "kube-api-access-8b4h4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.006973 4807 generic.go:334] "Generic (PLEG): container finished" podID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerID="8a23a8ee5dd1f6a8806df277e0705a464bae9430241b61c6068100a71e54cb24" exitCode=0
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.007014 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491","Type":"ContainerDied","Data":"8a23a8ee5dd1f6a8806df277e0705a464bae9430241b61c6068100a71e54cb24"}
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.007042 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d09b001-7ffd-4b2c-9ab0-3b53e54c6491","Type":"ContainerDied","Data":"f868ed3e5322ee8abd9fd7bb6382e65c3b8372e5c6501e3618b2dfa8c946bfec"}
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.007062 4807 scope.go:117] "RemoveContainer" containerID="d5fea5a01bd51b28f808cc30c4f51471c5a893e8bc323836de2b0b5da9a01bd2"
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.007444 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.019748 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" (UID: "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.067960 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" (UID: "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.090258 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.090434 4807 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.090524 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b4h4\" (UniqueName: \"kubernetes.io/projected/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-kube-api-access-8b4h4\") on node \"crc\" DevicePath \"\""
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.090635 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-scripts\") on node \"crc\" DevicePath \"\""
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.090715 4807 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.093410 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-config-data" (OuterVolumeSpecName: "config-data") pod "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" (UID: "5d09b001-7ffd-4b2c-9ab0-3b53e54c6491"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.098940 4807 scope.go:117] "RemoveContainer" containerID="c9bea6942c23d826e34a12795c49bd4202c453fb4949bd2a1bf0dd93d00e3a07"
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.116336 4807 scope.go:117] "RemoveContainer" containerID="8a23a8ee5dd1f6a8806df277e0705a464bae9430241b61c6068100a71e54cb24"
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.134529 4807 scope.go:117] "RemoveContainer" containerID="b6531669e70229606721c9e5aa00518833f15780b6d6e474d44365fbf4b0876c"
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.153871 4807 scope.go:117] "RemoveContainer" containerID="d5fea5a01bd51b28f808cc30c4f51471c5a893e8bc323836de2b0b5da9a01bd2"
Nov 27 11:29:05 crc kubenswrapper[4807]: E1127 11:29:05.154323 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5fea5a01bd51b28f808cc30c4f51471c5a893e8bc323836de2b0b5da9a01bd2\": container with ID starting with d5fea5a01bd51b28f808cc30c4f51471c5a893e8bc323836de2b0b5da9a01bd2 not found: ID does not exist" containerID="d5fea5a01bd51b28f808cc30c4f51471c5a893e8bc323836de2b0b5da9a01bd2"
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.154367 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5fea5a01bd51b28f808cc30c4f51471c5a893e8bc323836de2b0b5da9a01bd2"} err="failed to get container status \"d5fea5a01bd51b28f808cc30c4f51471c5a893e8bc323836de2b0b5da9a01bd2\": rpc error: code = NotFound desc = could not find container \"d5fea5a01bd51b28f808cc30c4f51471c5a893e8bc323836de2b0b5da9a01bd2\": container with ID starting with d5fea5a01bd51b28f808cc30c4f51471c5a893e8bc323836de2b0b5da9a01bd2 not found: ID does not exist"
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.154418 4807 scope.go:117] "RemoveContainer" containerID="c9bea6942c23d826e34a12795c49bd4202c453fb4949bd2a1bf0dd93d00e3a07"
Nov 27 11:29:05 crc kubenswrapper[4807]: E1127 11:29:05.154870 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9bea6942c23d826e34a12795c49bd4202c453fb4949bd2a1bf0dd93d00e3a07\": container with ID starting with c9bea6942c23d826e34a12795c49bd4202c453fb4949bd2a1bf0dd93d00e3a07 not found: ID does not exist" containerID="c9bea6942c23d826e34a12795c49bd4202c453fb4949bd2a1bf0dd93d00e3a07"
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.154902 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9bea6942c23d826e34a12795c49bd4202c453fb4949bd2a1bf0dd93d00e3a07"} err="failed to get container status \"c9bea6942c23d826e34a12795c49bd4202c453fb4949bd2a1bf0dd93d00e3a07\": rpc error: code = NotFound desc = could not find container \"c9bea6942c23d826e34a12795c49bd4202c453fb4949bd2a1bf0dd93d00e3a07\": container with ID starting with c9bea6942c23d826e34a12795c49bd4202c453fb4949bd2a1bf0dd93d00e3a07 not found: ID does not exist"
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.154926 4807 scope.go:117] "RemoveContainer" containerID="8a23a8ee5dd1f6a8806df277e0705a464bae9430241b61c6068100a71e54cb24"
Nov 27 11:29:05 crc kubenswrapper[4807]: E1127 11:29:05.155158 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a23a8ee5dd1f6a8806df277e0705a464bae9430241b61c6068100a71e54cb24\": container with ID starting with 8a23a8ee5dd1f6a8806df277e0705a464bae9430241b61c6068100a71e54cb24 not found: ID does not exist" containerID="8a23a8ee5dd1f6a8806df277e0705a464bae9430241b61c6068100a71e54cb24"
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.155178 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a23a8ee5dd1f6a8806df277e0705a464bae9430241b61c6068100a71e54cb24"} err="failed to get container status \"8a23a8ee5dd1f6a8806df277e0705a464bae9430241b61c6068100a71e54cb24\": rpc error: code = NotFound desc = could not find container \"8a23a8ee5dd1f6a8806df277e0705a464bae9430241b61c6068100a71e54cb24\": container with ID starting with 8a23a8ee5dd1f6a8806df277e0705a464bae9430241b61c6068100a71e54cb24 not found: ID does not exist"
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.155190 4807 scope.go:117] "RemoveContainer" containerID="b6531669e70229606721c9e5aa00518833f15780b6d6e474d44365fbf4b0876c"
Nov 27 11:29:05 crc kubenswrapper[4807]: E1127 11:29:05.155542 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6531669e70229606721c9e5aa00518833f15780b6d6e474d44365fbf4b0876c\": container with ID starting with b6531669e70229606721c9e5aa00518833f15780b6d6e474d44365fbf4b0876c not found: ID does not exist" containerID="b6531669e70229606721c9e5aa00518833f15780b6d6e474d44365fbf4b0876c"
Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.155559 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6531669e70229606721c9e5aa00518833f15780b6d6e474d44365fbf4b0876c"} err="failed to get container status \"b6531669e70229606721c9e5aa00518833f15780b6d6e474d44365fbf4b0876c\": rpc error: code = NotFound desc = could 
not find container \"b6531669e70229606721c9e5aa00518833f15780b6d6e474d44365fbf4b0876c\": container with ID starting with b6531669e70229606721c9e5aa00518833f15780b6d6e474d44365fbf4b0876c not found: ID does not exist" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.192435 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.338202 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.346960 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.365067 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:29:05 crc kubenswrapper[4807]: E1127 11:29:05.365459 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="sg-core" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.365478 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="sg-core" Nov 27 11:29:05 crc kubenswrapper[4807]: E1127 11:29:05.365510 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="ceilometer-notification-agent" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.365519 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="ceilometer-notification-agent" Nov 27 11:29:05 crc kubenswrapper[4807]: E1127 11:29:05.365535 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="proxy-httpd" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.365541 4807 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="proxy-httpd" Nov 27 11:29:05 crc kubenswrapper[4807]: E1127 11:29:05.365549 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="ceilometer-central-agent" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.365556 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="ceilometer-central-agent" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.365711 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="proxy-httpd" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.365729 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="ceilometer-central-agent" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.365746 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="sg-core" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.365762 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" containerName="ceilometer-notification-agent" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.367638 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.370849 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.374929 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.375714 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.381797 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.500277 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-scripts\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.500347 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-run-httpd\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.500408 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-config-data\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.500439 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-log-httpd\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.500482 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.500517 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.500532 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.500555 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k5rw\" (UniqueName: \"kubernetes.io/projected/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-kube-api-access-6k5rw\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.543675 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d09b001-7ffd-4b2c-9ab0-3b53e54c6491" path="/var/lib/kubelet/pods/5d09b001-7ffd-4b2c-9ab0-3b53e54c6491/volumes" Nov 27 11:29:05 crc 
kubenswrapper[4807]: I1127 11:29:05.602426 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-run-httpd\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.602674 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-config-data\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.602814 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-log-httpd\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.603016 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.603184 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.603366 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.603495 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k5rw\" (UniqueName: \"kubernetes.io/projected/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-kube-api-access-6k5rw\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.603662 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-scripts\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.604271 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-run-httpd\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.604410 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-log-httpd\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.607750 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.607767 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.608846 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-scripts\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.610319 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.610384 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-config-data\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.620452 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k5rw\" (UniqueName: \"kubernetes.io/projected/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-kube-api-access-6k5rw\") pod \"ceilometer-0\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.685906 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:29:05 crc kubenswrapper[4807]: I1127 11:29:05.969839 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.016281 4807 generic.go:334] "Generic (PLEG): container finished" podID="b38c64fe-fd37-446f-a0fc-e110a5904b22" containerID="c3dda1dae71b581417991de5e983fe83e264e73be603d42e6e17c081f9942dad" exitCode=137 Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.016337 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.016333 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b38c64fe-fd37-446f-a0fc-e110a5904b22","Type":"ContainerDied","Data":"c3dda1dae71b581417991de5e983fe83e264e73be603d42e6e17c081f9942dad"} Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.016480 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b38c64fe-fd37-446f-a0fc-e110a5904b22","Type":"ContainerDied","Data":"7017aee7e12d0bae7e84f545ce509cd32112c0fc54420c8e7e63dca09c64ec62"} Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.016573 4807 scope.go:117] "RemoveContainer" containerID="c3dda1dae71b581417991de5e983fe83e264e73be603d42e6e17c081f9942dad" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.040956 4807 scope.go:117] "RemoveContainer" containerID="c3dda1dae71b581417991de5e983fe83e264e73be603d42e6e17c081f9942dad" Nov 27 11:29:06 crc kubenswrapper[4807]: E1127 11:29:06.041342 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3dda1dae71b581417991de5e983fe83e264e73be603d42e6e17c081f9942dad\": container with ID starting with c3dda1dae71b581417991de5e983fe83e264e73be603d42e6e17c081f9942dad not found: ID does not exist" containerID="c3dda1dae71b581417991de5e983fe83e264e73be603d42e6e17c081f9942dad" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 
11:29:06.041389 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3dda1dae71b581417991de5e983fe83e264e73be603d42e6e17c081f9942dad"} err="failed to get container status \"c3dda1dae71b581417991de5e983fe83e264e73be603d42e6e17c081f9942dad\": rpc error: code = NotFound desc = could not find container \"c3dda1dae71b581417991de5e983fe83e264e73be603d42e6e17c081f9942dad\": container with ID starting with c3dda1dae71b581417991de5e983fe83e264e73be603d42e6e17c081f9942dad not found: ID does not exist" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.114511 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjpph\" (UniqueName: \"kubernetes.io/projected/b38c64fe-fd37-446f-a0fc-e110a5904b22-kube-api-access-sjpph\") pod \"b38c64fe-fd37-446f-a0fc-e110a5904b22\" (UID: \"b38c64fe-fd37-446f-a0fc-e110a5904b22\") " Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.114798 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b38c64fe-fd37-446f-a0fc-e110a5904b22-combined-ca-bundle\") pod \"b38c64fe-fd37-446f-a0fc-e110a5904b22\" (UID: \"b38c64fe-fd37-446f-a0fc-e110a5904b22\") " Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.114891 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b38c64fe-fd37-446f-a0fc-e110a5904b22-config-data\") pod \"b38c64fe-fd37-446f-a0fc-e110a5904b22\" (UID: \"b38c64fe-fd37-446f-a0fc-e110a5904b22\") " Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.122632 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b38c64fe-fd37-446f-a0fc-e110a5904b22-kube-api-access-sjpph" (OuterVolumeSpecName: "kube-api-access-sjpph") pod "b38c64fe-fd37-446f-a0fc-e110a5904b22" (UID: "b38c64fe-fd37-446f-a0fc-e110a5904b22"). 
InnerVolumeSpecName "kube-api-access-sjpph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.144805 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b38c64fe-fd37-446f-a0fc-e110a5904b22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b38c64fe-fd37-446f-a0fc-e110a5904b22" (UID: "b38c64fe-fd37-446f-a0fc-e110a5904b22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.146827 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b38c64fe-fd37-446f-a0fc-e110a5904b22-config-data" (OuterVolumeSpecName: "config-data") pod "b38c64fe-fd37-446f-a0fc-e110a5904b22" (UID: "b38c64fe-fd37-446f-a0fc-e110a5904b22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.177108 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:29:06 crc kubenswrapper[4807]: W1127 11:29:06.180725 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a5928a4_1dcc_4800_9fd7_9af0d18cfa5b.slice/crio-0a2b9e7d408eb29263b61f1a00f498a32df6718f8bb97ad7d0d04dabe369c7f7 WatchSource:0}: Error finding container 0a2b9e7d408eb29263b61f1a00f498a32df6718f8bb97ad7d0d04dabe369c7f7: Status 404 returned error can't find the container with id 0a2b9e7d408eb29263b61f1a00f498a32df6718f8bb97ad7d0d04dabe369c7f7 Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.218183 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjpph\" (UniqueName: \"kubernetes.io/projected/b38c64fe-fd37-446f-a0fc-e110a5904b22-kube-api-access-sjpph\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 
11:29:06.218552 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b38c64fe-fd37-446f-a0fc-e110a5904b22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.218578 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b38c64fe-fd37-446f-a0fc-e110a5904b22-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.377309 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.394708 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.410301 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 11:29:06 crc kubenswrapper[4807]: E1127 11:29:06.410804 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b38c64fe-fd37-446f-a0fc-e110a5904b22" containerName="nova-cell1-novncproxy-novncproxy" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.410831 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38c64fe-fd37-446f-a0fc-e110a5904b22" containerName="nova-cell1-novncproxy-novncproxy" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.411047 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b38c64fe-fd37-446f-a0fc-e110a5904b22" containerName="nova-cell1-novncproxy-novncproxy" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.411833 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.414074 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.414468 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.414603 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.423829 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.522539 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/148fe221-9289-4661-9ac6-fa5eb6af9b7f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.522732 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148fe221-9289-4661-9ac6-fa5eb6af9b7f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.522778 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148fe221-9289-4661-9ac6-fa5eb6af9b7f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc 
kubenswrapper[4807]: I1127 11:29:06.522919 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/148fe221-9289-4661-9ac6-fa5eb6af9b7f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.522954 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p5j2\" (UniqueName: \"kubernetes.io/projected/148fe221-9289-4661-9ac6-fa5eb6af9b7f-kube-api-access-2p5j2\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.624776 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/148fe221-9289-4661-9ac6-fa5eb6af9b7f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.624879 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148fe221-9289-4661-9ac6-fa5eb6af9b7f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.624908 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148fe221-9289-4661-9ac6-fa5eb6af9b7f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 
11:29:06.624991 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/148fe221-9289-4661-9ac6-fa5eb6af9b7f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.625010 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p5j2\" (UniqueName: \"kubernetes.io/projected/148fe221-9289-4661-9ac6-fa5eb6af9b7f-kube-api-access-2p5j2\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.630089 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/148fe221-9289-4661-9ac6-fa5eb6af9b7f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.630304 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/148fe221-9289-4661-9ac6-fa5eb6af9b7f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.631468 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/148fe221-9289-4661-9ac6-fa5eb6af9b7f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.631518 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/148fe221-9289-4661-9ac6-fa5eb6af9b7f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.643456 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p5j2\" (UniqueName: \"kubernetes.io/projected/148fe221-9289-4661-9ac6-fa5eb6af9b7f-kube-api-access-2p5j2\") pod \"nova-cell1-novncproxy-0\" (UID: \"148fe221-9289-4661-9ac6-fa5eb6af9b7f\") " pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:06 crc kubenswrapper[4807]: I1127 11:29:06.729865 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:07 crc kubenswrapper[4807]: I1127 11:29:07.027971 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b","Type":"ContainerStarted","Data":"c54cec701a61b746e0afdb65c022ebd964cb4f32440a20e63300f24a46585b83"} Nov 27 11:29:07 crc kubenswrapper[4807]: I1127 11:29:07.028311 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b","Type":"ContainerStarted","Data":"0a2b9e7d408eb29263b61f1a00f498a32df6718f8bb97ad7d0d04dabe369c7f7"} Nov 27 11:29:07 crc kubenswrapper[4807]: I1127 11:29:07.231625 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 27 11:29:07 crc kubenswrapper[4807]: W1127 11:29:07.234499 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod148fe221_9289_4661_9ac6_fa5eb6af9b7f.slice/crio-5268efb2380adf260b93435adce3aa316ac53a4f138be65ff6edbe0b031c53d3 WatchSource:0}: Error finding container 5268efb2380adf260b93435adce3aa316ac53a4f138be65ff6edbe0b031c53d3: Status 404 returned 
error can't find the container with id 5268efb2380adf260b93435adce3aa316ac53a4f138be65ff6edbe0b031c53d3 Nov 27 11:29:07 crc kubenswrapper[4807]: I1127 11:29:07.543208 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b38c64fe-fd37-446f-a0fc-e110a5904b22" path="/var/lib/kubelet/pods/b38c64fe-fd37-446f-a0fc-e110a5904b22/volumes" Nov 27 11:29:08 crc kubenswrapper[4807]: I1127 11:29:08.043622 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"148fe221-9289-4661-9ac6-fa5eb6af9b7f","Type":"ContainerStarted","Data":"5ee4a32ea56df3b745707d01b351481c1586e7eb1777badc9e03c92da9c2a7bc"} Nov 27 11:29:08 crc kubenswrapper[4807]: I1127 11:29:08.043665 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"148fe221-9289-4661-9ac6-fa5eb6af9b7f","Type":"ContainerStarted","Data":"5268efb2380adf260b93435adce3aa316ac53a4f138be65ff6edbe0b031c53d3"} Nov 27 11:29:08 crc kubenswrapper[4807]: I1127 11:29:08.046492 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b","Type":"ContainerStarted","Data":"1b7471af4cc9f698acd62ae78a2631e198f8f5d5bad59094089ae5aa0f709f31"} Nov 27 11:29:08 crc kubenswrapper[4807]: I1127 11:29:08.063167 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.063093111 podStartE2EDuration="2.063093111s" podCreationTimestamp="2025-11-27 11:29:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:29:08.062370772 +0000 UTC m=+1189.161868980" watchObservedRunningTime="2025-11-27 11:29:08.063093111 +0000 UTC m=+1189.162591349" Nov 27 11:29:08 crc kubenswrapper[4807]: I1127 11:29:08.449994 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Nov 27 11:29:08 crc kubenswrapper[4807]: I1127 11:29:08.450822 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 27 11:29:08 crc kubenswrapper[4807]: I1127 11:29:08.454062 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 27 11:29:08 crc kubenswrapper[4807]: I1127 11:29:08.456286 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.055992 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b","Type":"ContainerStarted","Data":"96ef7c00b788df030a90f1cb3cc5c7fa0a26a528829bee4b066f13bbff4f4431"} Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.057214 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.062568 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.234964 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-x7c26"] Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.236708 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.270724 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-x7c26"] Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.388464 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcxbj\" (UniqueName: \"kubernetes.io/projected/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-kube-api-access-jcxbj\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.388527 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-config\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.388667 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.388790 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.388826 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.389016 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.490223 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcxbj\" (UniqueName: \"kubernetes.io/projected/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-kube-api-access-jcxbj\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.490557 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-config\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.490623 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.490662 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.490679 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.490742 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.491567 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.491881 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.492097 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-ovsdbserver-sb\") 
pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.492457 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-config\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.492653 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.511263 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcxbj\" (UniqueName: \"kubernetes.io/projected/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-kube-api-access-jcxbj\") pod \"dnsmasq-dns-89c5cd4d5-x7c26\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:09 crc kubenswrapper[4807]: I1127 11:29:09.571595 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:10 crc kubenswrapper[4807]: I1127 11:29:10.088260 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-x7c26"] Nov 27 11:29:10 crc kubenswrapper[4807]: W1127 11:29:10.093334 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58a3e38f_7973_4aaf_8c5b_2f5f85dc1cc0.slice/crio-3616d5eb225e4bdd9d0304ddffc2b099f8c2f9803ab9ae4388a63a407a7f5808 WatchSource:0}: Error finding container 3616d5eb225e4bdd9d0304ddffc2b099f8c2f9803ab9ae4388a63a407a7f5808: Status 404 returned error can't find the container with id 3616d5eb225e4bdd9d0304ddffc2b099f8c2f9803ab9ae4388a63a407a7f5808 Nov 27 11:29:10 crc kubenswrapper[4807]: I1127 11:29:10.335525 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 27 11:29:11 crc kubenswrapper[4807]: I1127 11:29:11.078447 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b","Type":"ContainerStarted","Data":"1a3fa4eee5aa3b90194829a77d1b890bce124ad158ec854d96344692d8d71182"} Nov 27 11:29:11 crc kubenswrapper[4807]: I1127 11:29:11.079019 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 11:29:11 crc kubenswrapper[4807]: I1127 11:29:11.080618 4807 generic.go:334] "Generic (PLEG): container finished" podID="58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" containerID="1aa9c12c12f88c896c38a8c3a8bf25f42a798ffb781f525fc1b363b8f328427f" exitCode=0 Nov 27 11:29:11 crc kubenswrapper[4807]: I1127 11:29:11.081905 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" event={"ID":"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0","Type":"ContainerDied","Data":"1aa9c12c12f88c896c38a8c3a8bf25f42a798ffb781f525fc1b363b8f328427f"} Nov 27 11:29:11 crc 
kubenswrapper[4807]: I1127 11:29:11.082015 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" event={"ID":"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0","Type":"ContainerStarted","Data":"3616d5eb225e4bdd9d0304ddffc2b099f8c2f9803ab9ae4388a63a407a7f5808"} Nov 27 11:29:11 crc kubenswrapper[4807]: I1127 11:29:11.155754 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.327994584 podStartE2EDuration="6.155729803s" podCreationTimestamp="2025-11-27 11:29:05 +0000 UTC" firstStartedPulling="2025-11-27 11:29:06.185136346 +0000 UTC m=+1187.284634584" lastFinishedPulling="2025-11-27 11:29:10.012871605 +0000 UTC m=+1191.112369803" observedRunningTime="2025-11-27 11:29:11.111755984 +0000 UTC m=+1192.211254202" watchObservedRunningTime="2025-11-27 11:29:11.155729803 +0000 UTC m=+1192.255228001" Nov 27 11:29:11 crc kubenswrapper[4807]: I1127 11:29:11.415580 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:29:11 crc kubenswrapper[4807]: I1127 11:29:11.614792 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:29:11 crc kubenswrapper[4807]: I1127 11:29:11.731140 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:12 crc kubenswrapper[4807]: I1127 11:29:12.096017 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" event={"ID":"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0","Type":"ContainerStarted","Data":"c9f7b52ff66e41ddd5c373e41ffdc8ec7d660078add95e78372e1d5765339479"} Nov 27 11:29:12 crc kubenswrapper[4807]: I1127 11:29:12.096219 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="65c54290-ae69-4c00-8968-3aceb71768c2" containerName="nova-api-log" 
containerID="cri-o://9e788c066c7f2c23ffab074ade8a74c0c230f6c5fa4798f022d0911373e76d6e" gracePeriod=30 Nov 27 11:29:12 crc kubenswrapper[4807]: I1127 11:29:12.096506 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:12 crc kubenswrapper[4807]: I1127 11:29:12.096569 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="65c54290-ae69-4c00-8968-3aceb71768c2" containerName="nova-api-api" containerID="cri-o://6f52bd38850acf7dda375ad250259c439cb863ba86660f24aa0ec05382305ec7" gracePeriod=30 Nov 27 11:29:12 crc kubenswrapper[4807]: I1127 11:29:12.127525 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" podStartSLOduration=3.127508399 podStartE2EDuration="3.127508399s" podCreationTimestamp="2025-11-27 11:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:29:12.122508908 +0000 UTC m=+1193.222007126" watchObservedRunningTime="2025-11-27 11:29:12.127508399 +0000 UTC m=+1193.227006587" Nov 27 11:29:13 crc kubenswrapper[4807]: I1127 11:29:13.104308 4807 generic.go:334] "Generic (PLEG): container finished" podID="65c54290-ae69-4c00-8968-3aceb71768c2" containerID="9e788c066c7f2c23ffab074ade8a74c0c230f6c5fa4798f022d0911373e76d6e" exitCode=143 Nov 27 11:29:13 crc kubenswrapper[4807]: I1127 11:29:13.104405 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65c54290-ae69-4c00-8968-3aceb71768c2","Type":"ContainerDied","Data":"9e788c066c7f2c23ffab074ade8a74c0c230f6c5fa4798f022d0911373e76d6e"} Nov 27 11:29:13 crc kubenswrapper[4807]: I1127 11:29:13.104978 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="ceilometer-central-agent" 
containerID="cri-o://c54cec701a61b746e0afdb65c022ebd964cb4f32440a20e63300f24a46585b83" gracePeriod=30 Nov 27 11:29:13 crc kubenswrapper[4807]: I1127 11:29:13.105064 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="ceilometer-notification-agent" containerID="cri-o://1b7471af4cc9f698acd62ae78a2631e198f8f5d5bad59094089ae5aa0f709f31" gracePeriod=30 Nov 27 11:29:13 crc kubenswrapper[4807]: I1127 11:29:13.105085 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="sg-core" containerID="cri-o://96ef7c00b788df030a90f1cb3cc5c7fa0a26a528829bee4b066f13bbff4f4431" gracePeriod=30 Nov 27 11:29:13 crc kubenswrapper[4807]: I1127 11:29:13.105109 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="proxy-httpd" containerID="cri-o://1a3fa4eee5aa3b90194829a77d1b890bce124ad158ec854d96344692d8d71182" gracePeriod=30 Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.122271 4807 generic.go:334] "Generic (PLEG): container finished" podID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerID="1a3fa4eee5aa3b90194829a77d1b890bce124ad158ec854d96344692d8d71182" exitCode=0 Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.122578 4807 generic.go:334] "Generic (PLEG): container finished" podID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerID="96ef7c00b788df030a90f1cb3cc5c7fa0a26a528829bee4b066f13bbff4f4431" exitCode=2 Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.122589 4807 generic.go:334] "Generic (PLEG): container finished" podID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerID="1b7471af4cc9f698acd62ae78a2631e198f8f5d5bad59094089ae5aa0f709f31" exitCode=0 Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.122302 4807 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b","Type":"ContainerDied","Data":"1a3fa4eee5aa3b90194829a77d1b890bce124ad158ec854d96344692d8d71182"} Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.122711 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b","Type":"ContainerDied","Data":"96ef7c00b788df030a90f1cb3cc5c7fa0a26a528829bee4b066f13bbff4f4431"} Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.122740 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b","Type":"ContainerDied","Data":"1b7471af4cc9f698acd62ae78a2631e198f8f5d5bad59094089ae5aa0f709f31"} Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.476373 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.584498 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-ceilometer-tls-certs\") pod \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.584556 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-scripts\") pod \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.584610 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-log-httpd\") pod \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\" (UID: 
\"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.584706 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-combined-ca-bundle\") pod \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.584752 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-config-data\") pod \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.584800 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k5rw\" (UniqueName: \"kubernetes.io/projected/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-kube-api-access-6k5rw\") pod \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.584822 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-sg-core-conf-yaml\") pod \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.584911 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-run-httpd\") pod \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\" (UID: \"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b\") " Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.585183 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" (UID: "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.585425 4807 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.585429 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" (UID: "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.589996 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-kube-api-access-6k5rw" (OuterVolumeSpecName: "kube-api-access-6k5rw") pod "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" (UID: "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b"). InnerVolumeSpecName "kube-api-access-6k5rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.590792 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-scripts" (OuterVolumeSpecName: "scripts") pod "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" (UID: "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.613427 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" (UID: "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.636453 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" (UID: "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.659628 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" (UID: "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.686689 4807 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.686715 4807 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.686726 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.686736 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.686744 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k5rw\" (UniqueName: \"kubernetes.io/projected/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-kube-api-access-6k5rw\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.686754 4807 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.688510 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-config-data" (OuterVolumeSpecName: "config-data") pod "3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" (UID: 
"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:14 crc kubenswrapper[4807]: I1127 11:29:14.788497 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.132760 4807 generic.go:334] "Generic (PLEG): container finished" podID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerID="c54cec701a61b746e0afdb65c022ebd964cb4f32440a20e63300f24a46585b83" exitCode=0 Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.132812 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b","Type":"ContainerDied","Data":"c54cec701a61b746e0afdb65c022ebd964cb4f32440a20e63300f24a46585b83"} Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.132890 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b","Type":"ContainerDied","Data":"0a2b9e7d408eb29263b61f1a00f498a32df6718f8bb97ad7d0d04dabe369c7f7"} Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.132909 4807 scope.go:117] "RemoveContainer" containerID="1a3fa4eee5aa3b90194829a77d1b890bce124ad158ec854d96344692d8d71182" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.132849 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.151325 4807 scope.go:117] "RemoveContainer" containerID="96ef7c00b788df030a90f1cb3cc5c7fa0a26a528829bee4b066f13bbff4f4431" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.173376 4807 scope.go:117] "RemoveContainer" containerID="1b7471af4cc9f698acd62ae78a2631e198f8f5d5bad59094089ae5aa0f709f31" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.180639 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.200481 4807 scope.go:117] "RemoveContainer" containerID="c54cec701a61b746e0afdb65c022ebd964cb4f32440a20e63300f24a46585b83" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.201577 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.214853 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:29:15 crc kubenswrapper[4807]: E1127 11:29:15.215480 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="ceilometer-central-agent" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.215550 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="ceilometer-central-agent" Nov 27 11:29:15 crc kubenswrapper[4807]: E1127 11:29:15.215808 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="sg-core" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.215888 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="sg-core" Nov 27 11:29:15 crc kubenswrapper[4807]: E1127 11:29:15.216218 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="proxy-httpd" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.216304 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="proxy-httpd" Nov 27 11:29:15 crc kubenswrapper[4807]: E1127 11:29:15.216369 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="ceilometer-notification-agent" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.216427 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="ceilometer-notification-agent" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.216656 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="proxy-httpd" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.216720 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="sg-core" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.216783 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="ceilometer-central-agent" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.216843 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" containerName="ceilometer-notification-agent" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.218494 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.220508 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.220768 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.221537 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.241122 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.246796 4807 scope.go:117] "RemoveContainer" containerID="1a3fa4eee5aa3b90194829a77d1b890bce124ad158ec854d96344692d8d71182" Nov 27 11:29:15 crc kubenswrapper[4807]: E1127 11:29:15.247272 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a3fa4eee5aa3b90194829a77d1b890bce124ad158ec854d96344692d8d71182\": container with ID starting with 1a3fa4eee5aa3b90194829a77d1b890bce124ad158ec854d96344692d8d71182 not found: ID does not exist" containerID="1a3fa4eee5aa3b90194829a77d1b890bce124ad158ec854d96344692d8d71182" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.247328 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a3fa4eee5aa3b90194829a77d1b890bce124ad158ec854d96344692d8d71182"} err="failed to get container status \"1a3fa4eee5aa3b90194829a77d1b890bce124ad158ec854d96344692d8d71182\": rpc error: code = NotFound desc = could not find container \"1a3fa4eee5aa3b90194829a77d1b890bce124ad158ec854d96344692d8d71182\": container with ID starting with 1a3fa4eee5aa3b90194829a77d1b890bce124ad158ec854d96344692d8d71182 not found: ID does not exist" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 
11:29:15.247361 4807 scope.go:117] "RemoveContainer" containerID="96ef7c00b788df030a90f1cb3cc5c7fa0a26a528829bee4b066f13bbff4f4431" Nov 27 11:29:15 crc kubenswrapper[4807]: E1127 11:29:15.247860 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96ef7c00b788df030a90f1cb3cc5c7fa0a26a528829bee4b066f13bbff4f4431\": container with ID starting with 96ef7c00b788df030a90f1cb3cc5c7fa0a26a528829bee4b066f13bbff4f4431 not found: ID does not exist" containerID="96ef7c00b788df030a90f1cb3cc5c7fa0a26a528829bee4b066f13bbff4f4431" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.247887 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ef7c00b788df030a90f1cb3cc5c7fa0a26a528829bee4b066f13bbff4f4431"} err="failed to get container status \"96ef7c00b788df030a90f1cb3cc5c7fa0a26a528829bee4b066f13bbff4f4431\": rpc error: code = NotFound desc = could not find container \"96ef7c00b788df030a90f1cb3cc5c7fa0a26a528829bee4b066f13bbff4f4431\": container with ID starting with 96ef7c00b788df030a90f1cb3cc5c7fa0a26a528829bee4b066f13bbff4f4431 not found: ID does not exist" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.247903 4807 scope.go:117] "RemoveContainer" containerID="1b7471af4cc9f698acd62ae78a2631e198f8f5d5bad59094089ae5aa0f709f31" Nov 27 11:29:15 crc kubenswrapper[4807]: E1127 11:29:15.248189 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7471af4cc9f698acd62ae78a2631e198f8f5d5bad59094089ae5aa0f709f31\": container with ID starting with 1b7471af4cc9f698acd62ae78a2631e198f8f5d5bad59094089ae5aa0f709f31 not found: ID does not exist" containerID="1b7471af4cc9f698acd62ae78a2631e198f8f5d5bad59094089ae5aa0f709f31" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.248284 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1b7471af4cc9f698acd62ae78a2631e198f8f5d5bad59094089ae5aa0f709f31"} err="failed to get container status \"1b7471af4cc9f698acd62ae78a2631e198f8f5d5bad59094089ae5aa0f709f31\": rpc error: code = NotFound desc = could not find container \"1b7471af4cc9f698acd62ae78a2631e198f8f5d5bad59094089ae5aa0f709f31\": container with ID starting with 1b7471af4cc9f698acd62ae78a2631e198f8f5d5bad59094089ae5aa0f709f31 not found: ID does not exist" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.248318 4807 scope.go:117] "RemoveContainer" containerID="c54cec701a61b746e0afdb65c022ebd964cb4f32440a20e63300f24a46585b83" Nov 27 11:29:15 crc kubenswrapper[4807]: E1127 11:29:15.248633 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c54cec701a61b746e0afdb65c022ebd964cb4f32440a20e63300f24a46585b83\": container with ID starting with c54cec701a61b746e0afdb65c022ebd964cb4f32440a20e63300f24a46585b83 not found: ID does not exist" containerID="c54cec701a61b746e0afdb65c022ebd964cb4f32440a20e63300f24a46585b83" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.248657 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c54cec701a61b746e0afdb65c022ebd964cb4f32440a20e63300f24a46585b83"} err="failed to get container status \"c54cec701a61b746e0afdb65c022ebd964cb4f32440a20e63300f24a46585b83\": rpc error: code = NotFound desc = could not find container \"c54cec701a61b746e0afdb65c022ebd964cb4f32440a20e63300f24a46585b83\": container with ID starting with c54cec701a61b746e0afdb65c022ebd964cb4f32440a20e63300f24a46585b83 not found: ID does not exist" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.297489 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.297907 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-log-httpd\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.298304 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.298422 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-scripts\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.298524 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.298662 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lsfx\" (UniqueName: \"kubernetes.io/projected/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-kube-api-access-9lsfx\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc 
kubenswrapper[4807]: I1127 11:29:15.299315 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-run-httpd\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.299449 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-config-data\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.401183 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.401256 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-log-httpd\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.401290 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.401331 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-scripts\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.401368 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.401423 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lsfx\" (UniqueName: \"kubernetes.io/projected/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-kube-api-access-9lsfx\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.401444 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-run-httpd\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.401465 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-config-data\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.401907 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-log-httpd\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 
11:29:15.402132 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-run-httpd\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.404866 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.406379 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-config-data\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.406758 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.407639 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.419744 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-scripts\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " 
pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.422075 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lsfx\" (UniqueName: \"kubernetes.io/projected/2310e932-c289-4fe8-a5f9-ee9ce3ce915b-kube-api-access-9lsfx\") pod \"ceilometer-0\" (UID: \"2310e932-c289-4fe8-a5f9-ee9ce3ce915b\") " pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.541552 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b" path="/var/lib/kubelet/pods/3a5928a4-1dcc-4800-9fd7-9af0d18cfa5b/volumes" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.545712 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.976222 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 27 11:29:15 crc kubenswrapper[4807]: W1127 11:29:15.979214 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2310e932_c289_4fe8_a5f9_ee9ce3ce915b.slice/crio-5ebfed0de7d61202f0d60a0bb67528f1487d8cbd90c665b7bad17b829e90f048 WatchSource:0}: Error finding container 5ebfed0de7d61202f0d60a0bb67528f1487d8cbd90c665b7bad17b829e90f048: Status 404 returned error can't find the container with id 5ebfed0de7d61202f0d60a0bb67528f1487d8cbd90c665b7bad17b829e90f048 Nov 27 11:29:15 crc kubenswrapper[4807]: I1127 11:29:15.981555 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.007521 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.011984 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c54290-ae69-4c00-8968-3aceb71768c2-config-data\") pod \"65c54290-ae69-4c00-8968-3aceb71768c2\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.012128 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkngz\" (UniqueName: \"kubernetes.io/projected/65c54290-ae69-4c00-8968-3aceb71768c2-kube-api-access-lkngz\") pod \"65c54290-ae69-4c00-8968-3aceb71768c2\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.012218 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c54290-ae69-4c00-8968-3aceb71768c2-combined-ca-bundle\") pod \"65c54290-ae69-4c00-8968-3aceb71768c2\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.012265 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65c54290-ae69-4c00-8968-3aceb71768c2-logs\") pod \"65c54290-ae69-4c00-8968-3aceb71768c2\" (UID: \"65c54290-ae69-4c00-8968-3aceb71768c2\") " Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.013006 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65c54290-ae69-4c00-8968-3aceb71768c2-logs" (OuterVolumeSpecName: "logs") pod "65c54290-ae69-4c00-8968-3aceb71768c2" (UID: "65c54290-ae69-4c00-8968-3aceb71768c2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.017060 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c54290-ae69-4c00-8968-3aceb71768c2-kube-api-access-lkngz" (OuterVolumeSpecName: "kube-api-access-lkngz") pod "65c54290-ae69-4c00-8968-3aceb71768c2" (UID: "65c54290-ae69-4c00-8968-3aceb71768c2"). InnerVolumeSpecName "kube-api-access-lkngz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.067463 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c54290-ae69-4c00-8968-3aceb71768c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65c54290-ae69-4c00-8968-3aceb71768c2" (UID: "65c54290-ae69-4c00-8968-3aceb71768c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.073069 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65c54290-ae69-4c00-8968-3aceb71768c2-config-data" (OuterVolumeSpecName: "config-data") pod "65c54290-ae69-4c00-8968-3aceb71768c2" (UID: "65c54290-ae69-4c00-8968-3aceb71768c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.114312 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65c54290-ae69-4c00-8968-3aceb71768c2-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.114344 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkngz\" (UniqueName: \"kubernetes.io/projected/65c54290-ae69-4c00-8968-3aceb71768c2-kube-api-access-lkngz\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.114353 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65c54290-ae69-4c00-8968-3aceb71768c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.114362 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65c54290-ae69-4c00-8968-3aceb71768c2-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.140999 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2310e932-c289-4fe8-a5f9-ee9ce3ce915b","Type":"ContainerStarted","Data":"5ebfed0de7d61202f0d60a0bb67528f1487d8cbd90c665b7bad17b829e90f048"} Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.142968 4807 generic.go:334] "Generic (PLEG): container finished" podID="65c54290-ae69-4c00-8968-3aceb71768c2" containerID="6f52bd38850acf7dda375ad250259c439cb863ba86660f24aa0ec05382305ec7" exitCode=0 Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.143009 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65c54290-ae69-4c00-8968-3aceb71768c2","Type":"ContainerDied","Data":"6f52bd38850acf7dda375ad250259c439cb863ba86660f24aa0ec05382305ec7"} Nov 27 11:29:16 crc 
kubenswrapper[4807]: I1127 11:29:16.143025 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65c54290-ae69-4c00-8968-3aceb71768c2","Type":"ContainerDied","Data":"3777ef085d75a6c5bd1cbe578fea69e4f86df4fb3664af97e61b3266d13ab0e4"} Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.143043 4807 scope.go:117] "RemoveContainer" containerID="6f52bd38850acf7dda375ad250259c439cb863ba86660f24aa0ec05382305ec7" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.143123 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.174331 4807 scope.go:117] "RemoveContainer" containerID="9e788c066c7f2c23ffab074ade8a74c0c230f6c5fa4798f022d0911373e76d6e" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.178462 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.188224 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.197754 4807 scope.go:117] "RemoveContainer" containerID="6f52bd38850acf7dda375ad250259c439cb863ba86660f24aa0ec05382305ec7" Nov 27 11:29:16 crc kubenswrapper[4807]: E1127 11:29:16.199703 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f52bd38850acf7dda375ad250259c439cb863ba86660f24aa0ec05382305ec7\": container with ID starting with 6f52bd38850acf7dda375ad250259c439cb863ba86660f24aa0ec05382305ec7 not found: ID does not exist" containerID="6f52bd38850acf7dda375ad250259c439cb863ba86660f24aa0ec05382305ec7" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.199763 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f52bd38850acf7dda375ad250259c439cb863ba86660f24aa0ec05382305ec7"} err="failed to get container 
status \"6f52bd38850acf7dda375ad250259c439cb863ba86660f24aa0ec05382305ec7\": rpc error: code = NotFound desc = could not find container \"6f52bd38850acf7dda375ad250259c439cb863ba86660f24aa0ec05382305ec7\": container with ID starting with 6f52bd38850acf7dda375ad250259c439cb863ba86660f24aa0ec05382305ec7 not found: ID does not exist" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.199791 4807 scope.go:117] "RemoveContainer" containerID="9e788c066c7f2c23ffab074ade8a74c0c230f6c5fa4798f022d0911373e76d6e" Nov 27 11:29:16 crc kubenswrapper[4807]: E1127 11:29:16.202641 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e788c066c7f2c23ffab074ade8a74c0c230f6c5fa4798f022d0911373e76d6e\": container with ID starting with 9e788c066c7f2c23ffab074ade8a74c0c230f6c5fa4798f022d0911373e76d6e not found: ID does not exist" containerID="9e788c066c7f2c23ffab074ade8a74c0c230f6c5fa4798f022d0911373e76d6e" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.202679 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e788c066c7f2c23ffab074ade8a74c0c230f6c5fa4798f022d0911373e76d6e"} err="failed to get container status \"9e788c066c7f2c23ffab074ade8a74c0c230f6c5fa4798f022d0911373e76d6e\": rpc error: code = NotFound desc = could not find container \"9e788c066c7f2c23ffab074ade8a74c0c230f6c5fa4798f022d0911373e76d6e\": container with ID starting with 9e788c066c7f2c23ffab074ade8a74c0c230f6c5fa4798f022d0911373e76d6e not found: ID does not exist" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.206109 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 27 11:29:16 crc kubenswrapper[4807]: E1127 11:29:16.206510 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c54290-ae69-4c00-8968-3aceb71768c2" containerName="nova-api-log" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.206527 4807 
state_mem.go:107] "Deleted CPUSet assignment" podUID="65c54290-ae69-4c00-8968-3aceb71768c2" containerName="nova-api-log" Nov 27 11:29:16 crc kubenswrapper[4807]: E1127 11:29:16.206538 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65c54290-ae69-4c00-8968-3aceb71768c2" containerName="nova-api-api" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.206544 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="65c54290-ae69-4c00-8968-3aceb71768c2" containerName="nova-api-api" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.206718 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="65c54290-ae69-4c00-8968-3aceb71768c2" containerName="nova-api-api" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.206745 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="65c54290-ae69-4c00-8968-3aceb71768c2" containerName="nova-api-log" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.207660 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.211826 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.211991 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.212219 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.216055 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6ssr\" (UniqueName: \"kubernetes.io/projected/6efee299-b321-47e6-9969-bc94a7f3ccbe-kube-api-access-k6ssr\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.216096 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.216143 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-public-tls-certs\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.216172 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.216289 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-config-data\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.216329 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6efee299-b321-47e6-9969-bc94a7f3ccbe-logs\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.242185 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.318195 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6ssr\" (UniqueName: \"kubernetes.io/projected/6efee299-b321-47e6-9969-bc94a7f3ccbe-kube-api-access-k6ssr\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.318262 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.318309 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-public-tls-certs\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " 
pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.318344 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.318385 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-config-data\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.318427 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6efee299-b321-47e6-9969-bc94a7f3ccbe-logs\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.318976 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6efee299-b321-47e6-9969-bc94a7f3ccbe-logs\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.321676 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-public-tls-certs\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.322498 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.322807 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-config-data\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.324854 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.337106 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6ssr\" (UniqueName: \"kubernetes.io/projected/6efee299-b321-47e6-9969-bc94a7f3ccbe-kube-api-access-k6ssr\") pod \"nova-api-0\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.528496 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.730985 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.753737 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:16 crc kubenswrapper[4807]: I1127 11:29:16.980686 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.157182 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6efee299-b321-47e6-9969-bc94a7f3ccbe","Type":"ContainerStarted","Data":"d4af394fa6a0dfd1896e3cc824e00652f4fb5b0c9177a3cfec5be32516e2df30"} Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.157227 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6efee299-b321-47e6-9969-bc94a7f3ccbe","Type":"ContainerStarted","Data":"fdac42fd786c1afdd98ba7811401343469293b98e55aa366d5a87a7a0ab010ae"} Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.160843 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2310e932-c289-4fe8-a5f9-ee9ce3ce915b","Type":"ContainerStarted","Data":"4f6e0baa4700d2922a9ea1afb529f0813d6c73246e05b3738d6c5bdfead192bf"} Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.179172 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.379859 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hwsgz"] Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.382451 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.384798 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.384962 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.391370 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hwsgz"] Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.441291 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvbls\" (UniqueName: \"kubernetes.io/projected/91906330-92d8-46f3-97a8-a7b2cfd31d6c-kube-api-access-cvbls\") pod \"nova-cell1-cell-mapping-hwsgz\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.441439 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hwsgz\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.441511 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-scripts\") pod \"nova-cell1-cell-mapping-hwsgz\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.441552 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-config-data\") pod \"nova-cell1-cell-mapping-hwsgz\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.543005 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-config-data\") pod \"nova-cell1-cell-mapping-hwsgz\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.543095 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvbls\" (UniqueName: \"kubernetes.io/projected/91906330-92d8-46f3-97a8-a7b2cfd31d6c-kube-api-access-cvbls\") pod \"nova-cell1-cell-mapping-hwsgz\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.543176 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hwsgz\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.543219 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-scripts\") pod \"nova-cell1-cell-mapping-hwsgz\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.549652 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hwsgz\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.553017 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-scripts\") pod \"nova-cell1-cell-mapping-hwsgz\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.558368 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65c54290-ae69-4c00-8968-3aceb71768c2" path="/var/lib/kubelet/pods/65c54290-ae69-4c00-8968-3aceb71768c2/volumes" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.562019 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-config-data\") pod \"nova-cell1-cell-mapping-hwsgz\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.574017 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvbls\" (UniqueName: \"kubernetes.io/projected/91906330-92d8-46f3-97a8-a7b2cfd31d6c-kube-api-access-cvbls\") pod \"nova-cell1-cell-mapping-hwsgz\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:17 crc kubenswrapper[4807]: I1127 11:29:17.699659 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:18 crc kubenswrapper[4807]: I1127 11:29:18.174588 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6efee299-b321-47e6-9969-bc94a7f3ccbe","Type":"ContainerStarted","Data":"9e44bd07f21ba758446244e265b6587d4fed3eb08369600bc706327948f154c8"} Nov 27 11:29:18 crc kubenswrapper[4807]: I1127 11:29:18.178314 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2310e932-c289-4fe8-a5f9-ee9ce3ce915b","Type":"ContainerStarted","Data":"33152a21730120ac32110340704188b3134942c68f9a29dae2cf50222098866e"} Nov 27 11:29:18 crc kubenswrapper[4807]: I1127 11:29:18.192745 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hwsgz"] Nov 27 11:29:18 crc kubenswrapper[4807]: I1127 11:29:18.207841 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.207826359 podStartE2EDuration="2.207826359s" podCreationTimestamp="2025-11-27 11:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:29:18.204356079 +0000 UTC m=+1199.303854277" watchObservedRunningTime="2025-11-27 11:29:18.207826359 +0000 UTC m=+1199.307324557" Nov 27 11:29:19 crc kubenswrapper[4807]: I1127 11:29:19.193236 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hwsgz" event={"ID":"91906330-92d8-46f3-97a8-a7b2cfd31d6c","Type":"ContainerStarted","Data":"bf8b13017a4dd3696404757dc4f99cea793ecd72143d5a03ea2f71ea0fa9603e"} Nov 27 11:29:19 crc kubenswrapper[4807]: I1127 11:29:19.193510 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hwsgz" 
event={"ID":"91906330-92d8-46f3-97a8-a7b2cfd31d6c","Type":"ContainerStarted","Data":"1480a40aceb40c72b61a1b1e877d9e2a15c7c7e1b2195c47a7af1f1d0882436d"} Nov 27 11:29:19 crc kubenswrapper[4807]: I1127 11:29:19.196117 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2310e932-c289-4fe8-a5f9-ee9ce3ce915b","Type":"ContainerStarted","Data":"bdfb3d47529c8f60f3d5efc23399919da97128016c790d94403486db96936b09"} Nov 27 11:29:19 crc kubenswrapper[4807]: I1127 11:29:19.212797 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hwsgz" podStartSLOduration=2.212784283 podStartE2EDuration="2.212784283s" podCreationTimestamp="2025-11-27 11:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:29:19.205911153 +0000 UTC m=+1200.305409351" watchObservedRunningTime="2025-11-27 11:29:19.212784283 +0000 UTC m=+1200.312282471" Nov 27 11:29:19 crc kubenswrapper[4807]: I1127 11:29:19.576364 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:29:19 crc kubenswrapper[4807]: I1127 11:29:19.668041 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-qvc29"] Nov 27 11:29:19 crc kubenswrapper[4807]: I1127 11:29:19.668410 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" podUID="4862f897-1024-446a-bc1c-9b8c9c8e7792" containerName="dnsmasq-dns" containerID="cri-o://e53072ecd3e796bc06b4588e2133c288edd5bec0d412c409df34c7db9aec7b44" gracePeriod=10 Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.212153 4807 generic.go:334] "Generic (PLEG): container finished" podID="4862f897-1024-446a-bc1c-9b8c9c8e7792" containerID="e53072ecd3e796bc06b4588e2133c288edd5bec0d412c409df34c7db9aec7b44" exitCode=0 Nov 27 11:29:20 
crc kubenswrapper[4807]: I1127 11:29:20.212528 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" event={"ID":"4862f897-1024-446a-bc1c-9b8c9c8e7792","Type":"ContainerDied","Data":"e53072ecd3e796bc06b4588e2133c288edd5bec0d412c409df34c7db9aec7b44"} Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.212570 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" event={"ID":"4862f897-1024-446a-bc1c-9b8c9c8e7792","Type":"ContainerDied","Data":"0e1c32ac745999a6ac664e30ea3ac386f054dded4e56a31a6acb09e9eb001935"} Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.212583 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e1c32ac745999a6ac664e30ea3ac386f054dded4e56a31a6acb09e9eb001935" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.229227 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2310e932-c289-4fe8-a5f9-ee9ce3ce915b","Type":"ContainerStarted","Data":"5b48b73699f74a28354fc028f792afb28c8f1066d21a9f07e9e752e8ff0fbda6"} Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.229713 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.263465 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.491506079 podStartE2EDuration="5.263444031s" podCreationTimestamp="2025-11-27 11:29:15 +0000 UTC" firstStartedPulling="2025-11-27 11:29:15.981350682 +0000 UTC m=+1197.080848880" lastFinishedPulling="2025-11-27 11:29:19.753288634 +0000 UTC m=+1200.852786832" observedRunningTime="2025-11-27 11:29:20.252365181 +0000 UTC m=+1201.351863389" watchObservedRunningTime="2025-11-27 11:29:20.263444031 +0000 UTC m=+1201.362942229" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.398866 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-dns-swift-storage-0\") pod \"4862f897-1024-446a-bc1c-9b8c9c8e7792\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.399053 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-dns-svc\") pod \"4862f897-1024-446a-bc1c-9b8c9c8e7792\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.399135 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mncmg\" (UniqueName: \"kubernetes.io/projected/4862f897-1024-446a-bc1c-9b8c9c8e7792-kube-api-access-mncmg\") pod \"4862f897-1024-446a-bc1c-9b8c9c8e7792\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.399216 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-ovsdbserver-nb\") pod \"4862f897-1024-446a-bc1c-9b8c9c8e7792\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.399260 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-ovsdbserver-sb\") pod \"4862f897-1024-446a-bc1c-9b8c9c8e7792\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.399318 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-config\") pod \"4862f897-1024-446a-bc1c-9b8c9c8e7792\" (UID: \"4862f897-1024-446a-bc1c-9b8c9c8e7792\") " Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.404733 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4862f897-1024-446a-bc1c-9b8c9c8e7792-kube-api-access-mncmg" (OuterVolumeSpecName: "kube-api-access-mncmg") pod "4862f897-1024-446a-bc1c-9b8c9c8e7792" (UID: "4862f897-1024-446a-bc1c-9b8c9c8e7792"). InnerVolumeSpecName "kube-api-access-mncmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.456855 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4862f897-1024-446a-bc1c-9b8c9c8e7792" (UID: "4862f897-1024-446a-bc1c-9b8c9c8e7792"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.463384 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-config" (OuterVolumeSpecName: "config") pod "4862f897-1024-446a-bc1c-9b8c9c8e7792" (UID: "4862f897-1024-446a-bc1c-9b8c9c8e7792"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.465067 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4862f897-1024-446a-bc1c-9b8c9c8e7792" (UID: "4862f897-1024-446a-bc1c-9b8c9c8e7792"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.468823 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4862f897-1024-446a-bc1c-9b8c9c8e7792" (UID: "4862f897-1024-446a-bc1c-9b8c9c8e7792"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.482755 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4862f897-1024-446a-bc1c-9b8c9c8e7792" (UID: "4862f897-1024-446a-bc1c-9b8c9c8e7792"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.501830 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.501862 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mncmg\" (UniqueName: \"kubernetes.io/projected/4862f897-1024-446a-bc1c-9b8c9c8e7792-kube-api-access-mncmg\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.501873 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.501881 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.501889 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.501896 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4862f897-1024-446a-bc1c-9b8c9c8e7792-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.921675 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Nov 27 11:29:20 crc kubenswrapper[4807]: I1127 11:29:20.921737 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:29:21 crc kubenswrapper[4807]: I1127 11:29:21.239261 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" Nov 27 11:29:21 crc kubenswrapper[4807]: I1127 11:29:21.240027 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 27 11:29:21 crc kubenswrapper[4807]: I1127 11:29:21.284699 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-qvc29"] Nov 27 11:29:21 crc kubenswrapper[4807]: I1127 11:29:21.292564 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-qvc29"] Nov 27 11:29:21 crc kubenswrapper[4807]: I1127 11:29:21.542106 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4862f897-1024-446a-bc1c-9b8c9c8e7792" path="/var/lib/kubelet/pods/4862f897-1024-446a-bc1c-9b8c9c8e7792/volumes" Nov 27 11:29:24 crc kubenswrapper[4807]: I1127 11:29:24.277596 4807 generic.go:334] "Generic (PLEG): container finished" podID="91906330-92d8-46f3-97a8-a7b2cfd31d6c" containerID="bf8b13017a4dd3696404757dc4f99cea793ecd72143d5a03ea2f71ea0fa9603e" exitCode=0 Nov 27 11:29:24 crc kubenswrapper[4807]: I1127 11:29:24.277665 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hwsgz" event={"ID":"91906330-92d8-46f3-97a8-a7b2cfd31d6c","Type":"ContainerDied","Data":"bf8b13017a4dd3696404757dc4f99cea793ecd72143d5a03ea2f71ea0fa9603e"} Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.003229 4807 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/dnsmasq-dns-757b4f8459-qvc29" podUID="4862f897-1024-446a-bc1c-9b8c9c8e7792" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.184:5353: i/o timeout" Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.662116 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.833174 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-config-data\") pod \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.833231 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-combined-ca-bundle\") pod \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.833458 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-scripts\") pod \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.833499 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvbls\" (UniqueName: \"kubernetes.io/projected/91906330-92d8-46f3-97a8-a7b2cfd31d6c-kube-api-access-cvbls\") pod \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\" (UID: \"91906330-92d8-46f3-97a8-a7b2cfd31d6c\") " Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.838716 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/91906330-92d8-46f3-97a8-a7b2cfd31d6c-kube-api-access-cvbls" (OuterVolumeSpecName: "kube-api-access-cvbls") pod "91906330-92d8-46f3-97a8-a7b2cfd31d6c" (UID: "91906330-92d8-46f3-97a8-a7b2cfd31d6c"). InnerVolumeSpecName "kube-api-access-cvbls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.838816 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-scripts" (OuterVolumeSpecName: "scripts") pod "91906330-92d8-46f3-97a8-a7b2cfd31d6c" (UID: "91906330-92d8-46f3-97a8-a7b2cfd31d6c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.864094 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-config-data" (OuterVolumeSpecName: "config-data") pod "91906330-92d8-46f3-97a8-a7b2cfd31d6c" (UID: "91906330-92d8-46f3-97a8-a7b2cfd31d6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.870059 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91906330-92d8-46f3-97a8-a7b2cfd31d6c" (UID: "91906330-92d8-46f3-97a8-a7b2cfd31d6c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.934980 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.935010 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.935021 4807 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91906330-92d8-46f3-97a8-a7b2cfd31d6c-scripts\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:25 crc kubenswrapper[4807]: I1127 11:29:25.935029 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvbls\" (UniqueName: \"kubernetes.io/projected/91906330-92d8-46f3-97a8-a7b2cfd31d6c-kube-api-access-cvbls\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:26 crc kubenswrapper[4807]: I1127 11:29:26.294589 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hwsgz" event={"ID":"91906330-92d8-46f3-97a8-a7b2cfd31d6c","Type":"ContainerDied","Data":"1480a40aceb40c72b61a1b1e877d9e2a15c7c7e1b2195c47a7af1f1d0882436d"} Nov 27 11:29:26 crc kubenswrapper[4807]: I1127 11:29:26.294625 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1480a40aceb40c72b61a1b1e877d9e2a15c7c7e1b2195c47a7af1f1d0882436d" Nov 27 11:29:26 crc kubenswrapper[4807]: I1127 11:29:26.294701 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hwsgz" Nov 27 11:29:26 crc kubenswrapper[4807]: I1127 11:29:26.494576 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:29:26 crc kubenswrapper[4807]: I1127 11:29:26.494824 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d8515dce-7827-4661-ad4b-ca66a948eeeb" containerName="nova-scheduler-scheduler" containerID="cri-o://ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13" gracePeriod=30 Nov 27 11:29:26 crc kubenswrapper[4807]: I1127 11:29:26.503570 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:29:26 crc kubenswrapper[4807]: I1127 11:29:26.503882 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6efee299-b321-47e6-9969-bc94a7f3ccbe" containerName="nova-api-log" containerID="cri-o://d4af394fa6a0dfd1896e3cc824e00652f4fb5b0c9177a3cfec5be32516e2df30" gracePeriod=30 Nov 27 11:29:26 crc kubenswrapper[4807]: I1127 11:29:26.504040 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6efee299-b321-47e6-9969-bc94a7f3ccbe" containerName="nova-api-api" containerID="cri-o://9e44bd07f21ba758446244e265b6587d4fed3eb08369600bc706327948f154c8" gracePeriod=30 Nov 27 11:29:26 crc kubenswrapper[4807]: I1127 11:29:26.526660 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:29:26 crc kubenswrapper[4807]: I1127 11:29:26.526910 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerName="nova-metadata-log" containerID="cri-o://b66b568458143781102ecb88f209fc64366e2746f22ddf55ffe26193f2692e60" gracePeriod=30 Nov 27 11:29:26 crc kubenswrapper[4807]: I1127 11:29:26.526993 4807 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerName="nova-metadata-metadata" containerID="cri-o://94d58272ad2d23bff2c454b1bfd3bf257a526c6978c66c8739fc968ef6cd51e4" gracePeriod=30 Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.303979 4807 generic.go:334] "Generic (PLEG): container finished" podID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerID="b66b568458143781102ecb88f209fc64366e2746f22ddf55ffe26193f2692e60" exitCode=143 Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.304297 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4","Type":"ContainerDied","Data":"b66b568458143781102ecb88f209fc64366e2746f22ddf55ffe26193f2692e60"} Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.306386 4807 generic.go:334] "Generic (PLEG): container finished" podID="6efee299-b321-47e6-9969-bc94a7f3ccbe" containerID="9e44bd07f21ba758446244e265b6587d4fed3eb08369600bc706327948f154c8" exitCode=0 Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.306406 4807 generic.go:334] "Generic (PLEG): container finished" podID="6efee299-b321-47e6-9969-bc94a7f3ccbe" containerID="d4af394fa6a0dfd1896e3cc824e00652f4fb5b0c9177a3cfec5be32516e2df30" exitCode=143 Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.306422 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6efee299-b321-47e6-9969-bc94a7f3ccbe","Type":"ContainerDied","Data":"9e44bd07f21ba758446244e265b6587d4fed3eb08369600bc706327948f154c8"} Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.306438 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6efee299-b321-47e6-9969-bc94a7f3ccbe","Type":"ContainerDied","Data":"d4af394fa6a0dfd1896e3cc824e00652f4fb5b0c9177a3cfec5be32516e2df30"} Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 
11:29:27.394038 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.561896 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-internal-tls-certs\") pod \"6efee299-b321-47e6-9969-bc94a7f3ccbe\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.561937 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-config-data\") pod \"6efee299-b321-47e6-9969-bc94a7f3ccbe\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.562001 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6efee299-b321-47e6-9969-bc94a7f3ccbe-logs\") pod \"6efee299-b321-47e6-9969-bc94a7f3ccbe\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.562028 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-combined-ca-bundle\") pod \"6efee299-b321-47e6-9969-bc94a7f3ccbe\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.562048 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6ssr\" (UniqueName: \"kubernetes.io/projected/6efee299-b321-47e6-9969-bc94a7f3ccbe-kube-api-access-k6ssr\") pod \"6efee299-b321-47e6-9969-bc94a7f3ccbe\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.562147 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-public-tls-certs\") pod \"6efee299-b321-47e6-9969-bc94a7f3ccbe\" (UID: \"6efee299-b321-47e6-9969-bc94a7f3ccbe\") " Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.562269 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6efee299-b321-47e6-9969-bc94a7f3ccbe-logs" (OuterVolumeSpecName: "logs") pod "6efee299-b321-47e6-9969-bc94a7f3ccbe" (UID: "6efee299-b321-47e6-9969-bc94a7f3ccbe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.562531 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6efee299-b321-47e6-9969-bc94a7f3ccbe-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.568420 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6efee299-b321-47e6-9969-bc94a7f3ccbe-kube-api-access-k6ssr" (OuterVolumeSpecName: "kube-api-access-k6ssr") pod "6efee299-b321-47e6-9969-bc94a7f3ccbe" (UID: "6efee299-b321-47e6-9969-bc94a7f3ccbe"). InnerVolumeSpecName "kube-api-access-k6ssr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.589963 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6efee299-b321-47e6-9969-bc94a7f3ccbe" (UID: "6efee299-b321-47e6-9969-bc94a7f3ccbe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.598752 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-config-data" (OuterVolumeSpecName: "config-data") pod "6efee299-b321-47e6-9969-bc94a7f3ccbe" (UID: "6efee299-b321-47e6-9969-bc94a7f3ccbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.630958 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6efee299-b321-47e6-9969-bc94a7f3ccbe" (UID: "6efee299-b321-47e6-9969-bc94a7f3ccbe"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.632894 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6efee299-b321-47e6-9969-bc94a7f3ccbe" (UID: "6efee299-b321-47e6-9969-bc94a7f3ccbe"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.663877 4807 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.663913 4807 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.663922 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.663930 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6efee299-b321-47e6-9969-bc94a7f3ccbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:27 crc kubenswrapper[4807]: I1127 11:29:27.663941 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6ssr\" (UniqueName: \"kubernetes.io/projected/6efee299-b321-47e6-9969-bc94a7f3ccbe-kube-api-access-k6ssr\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:28 crc kubenswrapper[4807]: E1127 11:29:28.194497 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 27 11:29:28 crc kubenswrapper[4807]: E1127 11:29:28.196915 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec 
PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 27 11:29:28 crc kubenswrapper[4807]: E1127 11:29:28.198578 4807 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 27 11:29:28 crc kubenswrapper[4807]: E1127 11:29:28.198616 4807 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d8515dce-7827-4661-ad4b-ca66a948eeeb" containerName="nova-scheduler-scheduler" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.319465 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6efee299-b321-47e6-9969-bc94a7f3ccbe","Type":"ContainerDied","Data":"fdac42fd786c1afdd98ba7811401343469293b98e55aa366d5a87a7a0ab010ae"} Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.319561 4807 scope.go:117] "RemoveContainer" containerID="9e44bd07f21ba758446244e265b6587d4fed3eb08369600bc706327948f154c8" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.319637 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.358490 4807 scope.go:117] "RemoveContainer" containerID="d4af394fa6a0dfd1896e3cc824e00652f4fb5b0c9177a3cfec5be32516e2df30" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.368954 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.394243 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.414806 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 27 11:29:28 crc kubenswrapper[4807]: E1127 11:29:28.415200 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6efee299-b321-47e6-9969-bc94a7f3ccbe" containerName="nova-api-log" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.415213 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6efee299-b321-47e6-9969-bc94a7f3ccbe" containerName="nova-api-log" Nov 27 11:29:28 crc kubenswrapper[4807]: E1127 11:29:28.415226 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91906330-92d8-46f3-97a8-a7b2cfd31d6c" containerName="nova-manage" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.415233 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="91906330-92d8-46f3-97a8-a7b2cfd31d6c" containerName="nova-manage" Nov 27 11:29:28 crc kubenswrapper[4807]: E1127 11:29:28.415268 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4862f897-1024-446a-bc1c-9b8c9c8e7792" containerName="dnsmasq-dns" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.415276 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4862f897-1024-446a-bc1c-9b8c9c8e7792" containerName="dnsmasq-dns" Nov 27 11:29:28 crc kubenswrapper[4807]: E1127 11:29:28.415289 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6efee299-b321-47e6-9969-bc94a7f3ccbe" containerName="nova-api-api" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.415295 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6efee299-b321-47e6-9969-bc94a7f3ccbe" containerName="nova-api-api" Nov 27 11:29:28 crc kubenswrapper[4807]: E1127 11:29:28.415306 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4862f897-1024-446a-bc1c-9b8c9c8e7792" containerName="init" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.415312 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4862f897-1024-446a-bc1c-9b8c9c8e7792" containerName="init" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.415492 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4862f897-1024-446a-bc1c-9b8c9c8e7792" containerName="dnsmasq-dns" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.415510 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6efee299-b321-47e6-9969-bc94a7f3ccbe" containerName="nova-api-api" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.415519 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6efee299-b321-47e6-9969-bc94a7f3ccbe" containerName="nova-api-log" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.415534 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="91906330-92d8-46f3-97a8-a7b2cfd31d6c" containerName="nova-manage" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.416480 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.416552 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.448754 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.448962 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.449061 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.584875 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a071e484-2dfb-4bef-a538-69770c7f5f56-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.584987 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a071e484-2dfb-4bef-a538-69770c7f5f56-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.585728 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsmsd\" (UniqueName: \"kubernetes.io/projected/a071e484-2dfb-4bef-a538-69770c7f5f56-kube-api-access-dsmsd\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.585908 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a071e484-2dfb-4bef-a538-69770c7f5f56-config-data\") pod \"nova-api-0\" (UID: 
\"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.586075 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a071e484-2dfb-4bef-a538-69770c7f5f56-public-tls-certs\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.586233 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a071e484-2dfb-4bef-a538-69770c7f5f56-logs\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.687574 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsmsd\" (UniqueName: \"kubernetes.io/projected/a071e484-2dfb-4bef-a538-69770c7f5f56-kube-api-access-dsmsd\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.687614 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a071e484-2dfb-4bef-a538-69770c7f5f56-config-data\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.687659 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a071e484-2dfb-4bef-a538-69770c7f5f56-public-tls-certs\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.687701 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a071e484-2dfb-4bef-a538-69770c7f5f56-logs\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.687739 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a071e484-2dfb-4bef-a538-69770c7f5f56-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.687820 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a071e484-2dfb-4bef-a538-69770c7f5f56-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.688287 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a071e484-2dfb-4bef-a538-69770c7f5f56-logs\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.692562 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a071e484-2dfb-4bef-a538-69770c7f5f56-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.693713 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a071e484-2dfb-4bef-a538-69770c7f5f56-public-tls-certs\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.693950 
4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a071e484-2dfb-4bef-a538-69770c7f5f56-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.698612 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a071e484-2dfb-4bef-a538-69770c7f5f56-config-data\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.713964 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsmsd\" (UniqueName: \"kubernetes.io/projected/a071e484-2dfb-4bef-a538-69770c7f5f56-kube-api-access-dsmsd\") pod \"nova-api-0\" (UID: \"a071e484-2dfb-4bef-a538-69770c7f5f56\") " pod="openstack/nova-api-0" Nov 27 11:29:28 crc kubenswrapper[4807]: I1127 11:29:28.766230 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 27 11:29:29 crc kubenswrapper[4807]: I1127 11:29:29.205722 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 27 11:29:29 crc kubenswrapper[4807]: W1127 11:29:29.211228 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda071e484_2dfb_4bef_a538_69770c7f5f56.slice/crio-608187e4baf6af416c5b6f5fec8306a7ebc893dde734ea1e015c631e9b4e1769 WatchSource:0}: Error finding container 608187e4baf6af416c5b6f5fec8306a7ebc893dde734ea1e015c631e9b4e1769: Status 404 returned error can't find the container with id 608187e4baf6af416c5b6f5fec8306a7ebc893dde734ea1e015c631e9b4e1769 Nov 27 11:29:29 crc kubenswrapper[4807]: I1127 11:29:29.331044 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a071e484-2dfb-4bef-a538-69770c7f5f56","Type":"ContainerStarted","Data":"608187e4baf6af416c5b6f5fec8306a7ebc893dde734ea1e015c631e9b4e1769"} Nov 27 11:29:29 crc kubenswrapper[4807]: I1127 11:29:29.544897 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6efee299-b321-47e6-9969-bc94a7f3ccbe" path="/var/lib/kubelet/pods/6efee299-b321-47e6-9969-bc94a7f3ccbe/volumes" Nov 27 11:29:29 crc kubenswrapper[4807]: I1127 11:29:29.655864 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": read tcp 10.217.0.2:52354->10.217.0.189:8775: read: connection reset by peer" Nov 27 11:29:29 crc kubenswrapper[4807]: I1127 11:29:29.655864 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.189:8775/\": read tcp 
10.217.0.2:52356->10.217.0.189:8775: read: connection reset by peer" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.078576 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.212008 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-nova-metadata-tls-certs\") pod \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.212149 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-logs\") pod \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.212192 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-combined-ca-bundle\") pod \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.212316 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52m4f\" (UniqueName: \"kubernetes.io/projected/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-kube-api-access-52m4f\") pod \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\" (UID: \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.212353 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-config-data\") pod \"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\" (UID: 
\"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4\") " Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.213524 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-logs" (OuterVolumeSpecName: "logs") pod "80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" (UID: "80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.217373 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-kube-api-access-52m4f" (OuterVolumeSpecName: "kube-api-access-52m4f") pod "80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" (UID: "80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4"). InnerVolumeSpecName "kube-api-access-52m4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.251792 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-config-data" (OuterVolumeSpecName: "config-data") pod "80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" (UID: "80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.254994 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" (UID: "80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.273427 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" (UID: "80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.314695 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52m4f\" (UniqueName: \"kubernetes.io/projected/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-kube-api-access-52m4f\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.314983 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.314992 4807 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.315001 4807 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-logs\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.315010 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.343333 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerID="94d58272ad2d23bff2c454b1bfd3bf257a526c6978c66c8739fc968ef6cd51e4" exitCode=0 Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.343410 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4","Type":"ContainerDied","Data":"94d58272ad2d23bff2c454b1bfd3bf257a526c6978c66c8739fc968ef6cd51e4"} Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.343423 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.343461 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4","Type":"ContainerDied","Data":"db3381d1c55ce71d196ab014af475dae54a4b36fcdc954cf0e1c76dc862f029e"} Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.343480 4807 scope.go:117] "RemoveContainer" containerID="94d58272ad2d23bff2c454b1bfd3bf257a526c6978c66c8739fc968ef6cd51e4" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.345945 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a071e484-2dfb-4bef-a538-69770c7f5f56","Type":"ContainerStarted","Data":"17f0629c2415453c4b319a649ba5c4d7962da7191bcd3f9341e50a77a7991ddf"} Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.345986 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a071e484-2dfb-4bef-a538-69770c7f5f56","Type":"ContainerStarted","Data":"5239960754d2fd00dfd7b04c1fea85935975940aa29a33bf0d98c0d934dab897"} Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.371021 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.371004499 podStartE2EDuration="2.371004499s" podCreationTimestamp="2025-11-27 11:29:28 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:29:30.368984246 +0000 UTC m=+1211.468482454" watchObservedRunningTime="2025-11-27 11:29:30.371004499 +0000 UTC m=+1211.470502697" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.373429 4807 scope.go:117] "RemoveContainer" containerID="b66b568458143781102ecb88f209fc64366e2746f22ddf55ffe26193f2692e60" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.393779 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.397450 4807 scope.go:117] "RemoveContainer" containerID="94d58272ad2d23bff2c454b1bfd3bf257a526c6978c66c8739fc968ef6cd51e4" Nov 27 11:29:30 crc kubenswrapper[4807]: E1127 11:29:30.397970 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d58272ad2d23bff2c454b1bfd3bf257a526c6978c66c8739fc968ef6cd51e4\": container with ID starting with 94d58272ad2d23bff2c454b1bfd3bf257a526c6978c66c8739fc968ef6cd51e4 not found: ID does not exist" containerID="94d58272ad2d23bff2c454b1bfd3bf257a526c6978c66c8739fc968ef6cd51e4" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.398034 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d58272ad2d23bff2c454b1bfd3bf257a526c6978c66c8739fc968ef6cd51e4"} err="failed to get container status \"94d58272ad2d23bff2c454b1bfd3bf257a526c6978c66c8739fc968ef6cd51e4\": rpc error: code = NotFound desc = could not find container \"94d58272ad2d23bff2c454b1bfd3bf257a526c6978c66c8739fc968ef6cd51e4\": container with ID starting with 94d58272ad2d23bff2c454b1bfd3bf257a526c6978c66c8739fc968ef6cd51e4 not found: ID does not exist" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.398069 4807 scope.go:117] "RemoveContainer" containerID="b66b568458143781102ecb88f209fc64366e2746f22ddf55ffe26193f2692e60" Nov 27 11:29:30 
crc kubenswrapper[4807]: E1127 11:29:30.401369 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b66b568458143781102ecb88f209fc64366e2746f22ddf55ffe26193f2692e60\": container with ID starting with b66b568458143781102ecb88f209fc64366e2746f22ddf55ffe26193f2692e60 not found: ID does not exist" containerID="b66b568458143781102ecb88f209fc64366e2746f22ddf55ffe26193f2692e60" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.401405 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b66b568458143781102ecb88f209fc64366e2746f22ddf55ffe26193f2692e60"} err="failed to get container status \"b66b568458143781102ecb88f209fc64366e2746f22ddf55ffe26193f2692e60\": rpc error: code = NotFound desc = could not find container \"b66b568458143781102ecb88f209fc64366e2746f22ddf55ffe26193f2692e60\": container with ID starting with b66b568458143781102ecb88f209fc64366e2746f22ddf55ffe26193f2692e60 not found: ID does not exist" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.412626 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.449881 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:29:30 crc kubenswrapper[4807]: E1127 11:29:30.450542 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerName="nova-metadata-metadata" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.454548 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerName="nova-metadata-metadata" Nov 27 11:29:30 crc kubenswrapper[4807]: E1127 11:29:30.454642 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerName="nova-metadata-log" Nov 27 11:29:30 crc kubenswrapper[4807]: 
I1127 11:29:30.454656 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerName="nova-metadata-log" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.455145 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerName="nova-metadata-metadata" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.455169 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" containerName="nova-metadata-log" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.456729 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.461646 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.463386 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.463618 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.620749 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/088b7a89-396a-434c-b201-a7ecb96cb2e7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.620860 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/088b7a89-396a-434c-b201-a7ecb96cb2e7-config-data\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " 
pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.620914 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/088b7a89-396a-434c-b201-a7ecb96cb2e7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.620951 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l65j\" (UniqueName: \"kubernetes.io/projected/088b7a89-396a-434c-b201-a7ecb96cb2e7-kube-api-access-9l65j\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.620988 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/088b7a89-396a-434c-b201-a7ecb96cb2e7-logs\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.722741 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/088b7a89-396a-434c-b201-a7ecb96cb2e7-config-data\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.722867 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/088b7a89-396a-434c-b201-a7ecb96cb2e7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.722931 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9l65j\" (UniqueName: \"kubernetes.io/projected/088b7a89-396a-434c-b201-a7ecb96cb2e7-kube-api-access-9l65j\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.722999 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/088b7a89-396a-434c-b201-a7ecb96cb2e7-logs\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.723231 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/088b7a89-396a-434c-b201-a7ecb96cb2e7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.723520 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/088b7a89-396a-434c-b201-a7ecb96cb2e7-logs\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.726430 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/088b7a89-396a-434c-b201-a7ecb96cb2e7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.727148 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/088b7a89-396a-434c-b201-a7ecb96cb2e7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.727832 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/088b7a89-396a-434c-b201-a7ecb96cb2e7-config-data\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.742090 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l65j\" (UniqueName: \"kubernetes.io/projected/088b7a89-396a-434c-b201-a7ecb96cb2e7-kube-api-access-9l65j\") pod \"nova-metadata-0\" (UID: \"088b7a89-396a-434c-b201-a7ecb96cb2e7\") " pod="openstack/nova-metadata-0" Nov 27 11:29:30 crc kubenswrapper[4807]: I1127 11:29:30.781877 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 27 11:29:31 crc kubenswrapper[4807]: I1127 11:29:31.222156 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 27 11:29:31 crc kubenswrapper[4807]: W1127 11:29:31.223939 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod088b7a89_396a_434c_b201_a7ecb96cb2e7.slice/crio-1aedd0d880301f091ae8d1862625eab8615de98ac71b9f7f9c98fda9db37b6a2 WatchSource:0}: Error finding container 1aedd0d880301f091ae8d1862625eab8615de98ac71b9f7f9c98fda9db37b6a2: Status 404 returned error can't find the container with id 1aedd0d880301f091ae8d1862625eab8615de98ac71b9f7f9c98fda9db37b6a2 Nov 27 11:29:31 crc kubenswrapper[4807]: I1127 11:29:31.360561 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"088b7a89-396a-434c-b201-a7ecb96cb2e7","Type":"ContainerStarted","Data":"1aedd0d880301f091ae8d1862625eab8615de98ac71b9f7f9c98fda9db37b6a2"} Nov 27 11:29:31 crc kubenswrapper[4807]: 
I1127 11:29:31.545093 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4" path="/var/lib/kubelet/pods/80af84d5-da7b-4b8f-b7ab-1eef6e66e3e4/volumes" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.079776 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.251041 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8515dce-7827-4661-ad4b-ca66a948eeeb-config-data\") pod \"d8515dce-7827-4661-ad4b-ca66a948eeeb\" (UID: \"d8515dce-7827-4661-ad4b-ca66a948eeeb\") " Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.251094 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8515dce-7827-4661-ad4b-ca66a948eeeb-combined-ca-bundle\") pod \"d8515dce-7827-4661-ad4b-ca66a948eeeb\" (UID: \"d8515dce-7827-4661-ad4b-ca66a948eeeb\") " Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.251159 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l84z\" (UniqueName: \"kubernetes.io/projected/d8515dce-7827-4661-ad4b-ca66a948eeeb-kube-api-access-5l84z\") pod \"d8515dce-7827-4661-ad4b-ca66a948eeeb\" (UID: \"d8515dce-7827-4661-ad4b-ca66a948eeeb\") " Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.256795 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8515dce-7827-4661-ad4b-ca66a948eeeb-kube-api-access-5l84z" (OuterVolumeSpecName: "kube-api-access-5l84z") pod "d8515dce-7827-4661-ad4b-ca66a948eeeb" (UID: "d8515dce-7827-4661-ad4b-ca66a948eeeb"). InnerVolumeSpecName "kube-api-access-5l84z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.279753 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8515dce-7827-4661-ad4b-ca66a948eeeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8515dce-7827-4661-ad4b-ca66a948eeeb" (UID: "d8515dce-7827-4661-ad4b-ca66a948eeeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.282505 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8515dce-7827-4661-ad4b-ca66a948eeeb-config-data" (OuterVolumeSpecName: "config-data") pod "d8515dce-7827-4661-ad4b-ca66a948eeeb" (UID: "d8515dce-7827-4661-ad4b-ca66a948eeeb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.353036 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8515dce-7827-4661-ad4b-ca66a948eeeb-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.353068 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8515dce-7827-4661-ad4b-ca66a948eeeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.353077 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l84z\" (UniqueName: \"kubernetes.io/projected/d8515dce-7827-4661-ad4b-ca66a948eeeb-kube-api-access-5l84z\") on node \"crc\" DevicePath \"\"" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.373948 4807 generic.go:334] "Generic (PLEG): container finished" podID="d8515dce-7827-4661-ad4b-ca66a948eeeb" containerID="ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13" 
exitCode=0 Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.374013 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.374034 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d8515dce-7827-4661-ad4b-ca66a948eeeb","Type":"ContainerDied","Data":"ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13"} Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.374095 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d8515dce-7827-4661-ad4b-ca66a948eeeb","Type":"ContainerDied","Data":"fdc232336fb2c2f29bfb981c97aa8c0e86c4086a0140b688b6c51304bfb269df"} Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.374115 4807 scope.go:117] "RemoveContainer" containerID="ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.376871 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"088b7a89-396a-434c-b201-a7ecb96cb2e7","Type":"ContainerStarted","Data":"5b9fce06b66611944aa8c390435059688fc223ce7ff309ed4d141e06ea86db53"} Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.376898 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"088b7a89-396a-434c-b201-a7ecb96cb2e7","Type":"ContainerStarted","Data":"57285e787307207fda3dd98606df7e11d968dc6ddd7aa7e530810c75e4b1fe12"} Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.398781 4807 scope.go:117] "RemoveContainer" containerID="ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13" Nov 27 11:29:32 crc kubenswrapper[4807]: E1127 11:29:32.399435 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13\": container with ID starting with ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13 not found: ID does not exist" containerID="ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.399484 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13"} err="failed to get container status \"ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13\": rpc error: code = NotFound desc = could not find container \"ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13\": container with ID starting with ab61174e07bbf8a014fe6bda2ebaeea6c9db1d1435294253b711db27a545da13 not found: ID does not exist" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.409827 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.409805479 podStartE2EDuration="2.409805479s" podCreationTimestamp="2025-11-27 11:29:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:29:32.396269656 +0000 UTC m=+1213.495767874" watchObservedRunningTime="2025-11-27 11:29:32.409805479 +0000 UTC m=+1213.509303677" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.424101 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.440187 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.449683 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:29:32 crc kubenswrapper[4807]: E1127 11:29:32.450447 4807 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d8515dce-7827-4661-ad4b-ca66a948eeeb" containerName="nova-scheduler-scheduler" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.450496 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8515dce-7827-4661-ad4b-ca66a948eeeb" containerName="nova-scheduler-scheduler" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.450910 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8515dce-7827-4661-ad4b-ca66a948eeeb" containerName="nova-scheduler-scheduler" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.452076 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.465000 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.468573 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.556484 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c3409e-49b1-4dfd-ba16-005e9f6e5a44-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"39c3409e-49b1-4dfd-ba16-005e9f6e5a44\") " pod="openstack/nova-scheduler-0" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.556559 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlkz9\" (UniqueName: \"kubernetes.io/projected/39c3409e-49b1-4dfd-ba16-005e9f6e5a44-kube-api-access-rlkz9\") pod \"nova-scheduler-0\" (UID: \"39c3409e-49b1-4dfd-ba16-005e9f6e5a44\") " pod="openstack/nova-scheduler-0" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.556716 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/39c3409e-49b1-4dfd-ba16-005e9f6e5a44-config-data\") pod \"nova-scheduler-0\" (UID: \"39c3409e-49b1-4dfd-ba16-005e9f6e5a44\") " pod="openstack/nova-scheduler-0" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.659357 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c3409e-49b1-4dfd-ba16-005e9f6e5a44-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"39c3409e-49b1-4dfd-ba16-005e9f6e5a44\") " pod="openstack/nova-scheduler-0" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.659416 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlkz9\" (UniqueName: \"kubernetes.io/projected/39c3409e-49b1-4dfd-ba16-005e9f6e5a44-kube-api-access-rlkz9\") pod \"nova-scheduler-0\" (UID: \"39c3409e-49b1-4dfd-ba16-005e9f6e5a44\") " pod="openstack/nova-scheduler-0" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.659770 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c3409e-49b1-4dfd-ba16-005e9f6e5a44-config-data\") pod \"nova-scheduler-0\" (UID: \"39c3409e-49b1-4dfd-ba16-005e9f6e5a44\") " pod="openstack/nova-scheduler-0" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.662921 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39c3409e-49b1-4dfd-ba16-005e9f6e5a44-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"39c3409e-49b1-4dfd-ba16-005e9f6e5a44\") " pod="openstack/nova-scheduler-0" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.663455 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39c3409e-49b1-4dfd-ba16-005e9f6e5a44-config-data\") pod \"nova-scheduler-0\" (UID: \"39c3409e-49b1-4dfd-ba16-005e9f6e5a44\") " pod="openstack/nova-scheduler-0" 
Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.677474 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlkz9\" (UniqueName: \"kubernetes.io/projected/39c3409e-49b1-4dfd-ba16-005e9f6e5a44-kube-api-access-rlkz9\") pod \"nova-scheduler-0\" (UID: \"39c3409e-49b1-4dfd-ba16-005e9f6e5a44\") " pod="openstack/nova-scheduler-0" Nov 27 11:29:32 crc kubenswrapper[4807]: I1127 11:29:32.768993 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 27 11:29:33 crc kubenswrapper[4807]: W1127 11:29:33.181106 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39c3409e_49b1_4dfd_ba16_005e9f6e5a44.slice/crio-a79910b547db9419e1ae86c12445cc4ed056d55e304f86355d455dec48e7772c WatchSource:0}: Error finding container a79910b547db9419e1ae86c12445cc4ed056d55e304f86355d455dec48e7772c: Status 404 returned error can't find the container with id a79910b547db9419e1ae86c12445cc4ed056d55e304f86355d455dec48e7772c Nov 27 11:29:33 crc kubenswrapper[4807]: I1127 11:29:33.181588 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 27 11:29:33 crc kubenswrapper[4807]: I1127 11:29:33.392741 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"39c3409e-49b1-4dfd-ba16-005e9f6e5a44","Type":"ContainerStarted","Data":"ad05248ba725a04d627afe79d610f6b038937091e55d75669fe010cff981c1bb"} Nov 27 11:29:33 crc kubenswrapper[4807]: I1127 11:29:33.393157 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"39c3409e-49b1-4dfd-ba16-005e9f6e5a44","Type":"ContainerStarted","Data":"a79910b547db9419e1ae86c12445cc4ed056d55e304f86355d455dec48e7772c"} Nov 27 11:29:33 crc kubenswrapper[4807]: I1127 11:29:33.408222 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-scheduler-0" podStartSLOduration=1.408201571 podStartE2EDuration="1.408201571s" podCreationTimestamp="2025-11-27 11:29:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:29:33.407747479 +0000 UTC m=+1214.507245717" watchObservedRunningTime="2025-11-27 11:29:33.408201571 +0000 UTC m=+1214.507699769" Nov 27 11:29:33 crc kubenswrapper[4807]: I1127 11:29:33.569645 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8515dce-7827-4661-ad4b-ca66a948eeeb" path="/var/lib/kubelet/pods/d8515dce-7827-4661-ad4b-ca66a948eeeb/volumes" Nov 27 11:29:35 crc kubenswrapper[4807]: I1127 11:29:35.782167 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 11:29:35 crc kubenswrapper[4807]: I1127 11:29:35.782456 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 27 11:29:37 crc kubenswrapper[4807]: I1127 11:29:37.769344 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 27 11:29:38 crc kubenswrapper[4807]: I1127 11:29:38.766386 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 27 11:29:38 crc kubenswrapper[4807]: I1127 11:29:38.766697 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 27 11:29:39 crc kubenswrapper[4807]: I1127 11:29:39.783443 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a071e484-2dfb-4bef-a538-69770c7f5f56" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 11:29:39 crc kubenswrapper[4807]: I1127 11:29:39.783556 4807 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-api-0" podUID="a071e484-2dfb-4bef-a538-69770c7f5f56" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 27 11:29:40 crc kubenswrapper[4807]: I1127 11:29:40.782999 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 27 11:29:40 crc kubenswrapper[4807]: I1127 11:29:40.783037 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 27 11:29:41 crc kubenswrapper[4807]: I1127 11:29:41.795437 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="088b7a89-396a-434c-b201-a7ecb96cb2e7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 27 11:29:41 crc kubenswrapper[4807]: I1127 11:29:41.795466 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="088b7a89-396a-434c-b201-a7ecb96cb2e7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 27 11:29:42 crc kubenswrapper[4807]: I1127 11:29:42.776581 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 27 11:29:42 crc kubenswrapper[4807]: I1127 11:29:42.800381 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 27 11:29:43 crc kubenswrapper[4807]: I1127 11:29:43.510429 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 27 11:29:45 crc kubenswrapper[4807]: I1127 11:29:45.567602 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ceilometer-0" Nov 27 11:29:48 crc kubenswrapper[4807]: I1127 11:29:48.773393 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 27 11:29:48 crc kubenswrapper[4807]: I1127 11:29:48.774065 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 27 11:29:48 crc kubenswrapper[4807]: I1127 11:29:48.776807 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 27 11:29:48 crc kubenswrapper[4807]: I1127 11:29:48.784738 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 27 11:29:49 crc kubenswrapper[4807]: I1127 11:29:49.544569 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 27 11:29:49 crc kubenswrapper[4807]: I1127 11:29:49.546310 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 27 11:29:50 crc kubenswrapper[4807]: I1127 11:29:50.788307 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 27 11:29:50 crc kubenswrapper[4807]: I1127 11:29:50.790664 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 27 11:29:50 crc kubenswrapper[4807]: I1127 11:29:50.794765 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 27 11:29:50 crc kubenswrapper[4807]: I1127 11:29:50.921425 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:29:50 crc kubenswrapper[4807]: I1127 11:29:50.921491 4807 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:29:51 crc kubenswrapper[4807]: I1127 11:29:51.559893 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 27 11:29:59 crc kubenswrapper[4807]: I1127 11:29:59.339291 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.076289 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.155151 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7"] Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.156626 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.159657 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.159856 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.189324 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7"] Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.190255 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4550963-890e-4525-901a-e9b520618ff8-secret-volume\") pod \"collect-profiles-29404050-9llx7\" (UID: \"b4550963-890e-4525-901a-e9b520618ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.190317 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw9pb\" (UniqueName: \"kubernetes.io/projected/b4550963-890e-4525-901a-e9b520618ff8-kube-api-access-nw9pb\") pod \"collect-profiles-29404050-9llx7\" (UID: \"b4550963-890e-4525-901a-e9b520618ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.190426 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4550963-890e-4525-901a-e9b520618ff8-config-volume\") pod \"collect-profiles-29404050-9llx7\" (UID: \"b4550963-890e-4525-901a-e9b520618ff8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.294255 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4550963-890e-4525-901a-e9b520618ff8-secret-volume\") pod \"collect-profiles-29404050-9llx7\" (UID: \"b4550963-890e-4525-901a-e9b520618ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.294324 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw9pb\" (UniqueName: \"kubernetes.io/projected/b4550963-890e-4525-901a-e9b520618ff8-kube-api-access-nw9pb\") pod \"collect-profiles-29404050-9llx7\" (UID: \"b4550963-890e-4525-901a-e9b520618ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.294424 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4550963-890e-4525-901a-e9b520618ff8-config-volume\") pod \"collect-profiles-29404050-9llx7\" (UID: \"b4550963-890e-4525-901a-e9b520618ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.298934 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4550963-890e-4525-901a-e9b520618ff8-config-volume\") pod \"collect-profiles-29404050-9llx7\" (UID: \"b4550963-890e-4525-901a-e9b520618ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.302843 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b4550963-890e-4525-901a-e9b520618ff8-secret-volume\") pod \"collect-profiles-29404050-9llx7\" (UID: \"b4550963-890e-4525-901a-e9b520618ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.329188 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw9pb\" (UniqueName: \"kubernetes.io/projected/b4550963-890e-4525-901a-e9b520618ff8-kube-api-access-nw9pb\") pod \"collect-profiles-29404050-9llx7\" (UID: \"b4550963-890e-4525-901a-e9b520618ff8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" Nov 27 11:30:00 crc kubenswrapper[4807]: I1127 11:30:00.474497 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" Nov 27 11:30:01 crc kubenswrapper[4807]: I1127 11:30:01.007286 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7"] Nov 27 11:30:01 crc kubenswrapper[4807]: I1127 11:30:01.673120 4807 generic.go:334] "Generic (PLEG): container finished" podID="b4550963-890e-4525-901a-e9b520618ff8" containerID="d365fdc122d406c370af34c79f0b84f63526ae27c969d680055a446b6a3b4a56" exitCode=0 Nov 27 11:30:01 crc kubenswrapper[4807]: I1127 11:30:01.673170 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" event={"ID":"b4550963-890e-4525-901a-e9b520618ff8","Type":"ContainerDied","Data":"d365fdc122d406c370af34c79f0b84f63526ae27c969d680055a446b6a3b4a56"} Nov 27 11:30:01 crc kubenswrapper[4807]: I1127 11:30:01.673457 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" 
event={"ID":"b4550963-890e-4525-901a-e9b520618ff8","Type":"ContainerStarted","Data":"ad39211a9f93afbcef5ec04195b0d279cffbc210edd39aacab84027669f7b2bf"} Nov 27 11:30:02 crc kubenswrapper[4807]: I1127 11:30:02.975855 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" Nov 27 11:30:03 crc kubenswrapper[4807]: I1127 11:30:03.043519 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4550963-890e-4525-901a-e9b520618ff8-secret-volume\") pod \"b4550963-890e-4525-901a-e9b520618ff8\" (UID: \"b4550963-890e-4525-901a-e9b520618ff8\") " Nov 27 11:30:03 crc kubenswrapper[4807]: I1127 11:30:03.043753 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4550963-890e-4525-901a-e9b520618ff8-config-volume\") pod \"b4550963-890e-4525-901a-e9b520618ff8\" (UID: \"b4550963-890e-4525-901a-e9b520618ff8\") " Nov 27 11:30:03 crc kubenswrapper[4807]: I1127 11:30:03.043846 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw9pb\" (UniqueName: \"kubernetes.io/projected/b4550963-890e-4525-901a-e9b520618ff8-kube-api-access-nw9pb\") pod \"b4550963-890e-4525-901a-e9b520618ff8\" (UID: \"b4550963-890e-4525-901a-e9b520618ff8\") " Nov 27 11:30:03 crc kubenswrapper[4807]: I1127 11:30:03.044271 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4550963-890e-4525-901a-e9b520618ff8-config-volume" (OuterVolumeSpecName: "config-volume") pod "b4550963-890e-4525-901a-e9b520618ff8" (UID: "b4550963-890e-4525-901a-e9b520618ff8"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:03 crc kubenswrapper[4807]: I1127 11:30:03.044696 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4550963-890e-4525-901a-e9b520618ff8-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:03 crc kubenswrapper[4807]: I1127 11:30:03.062764 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4550963-890e-4525-901a-e9b520618ff8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b4550963-890e-4525-901a-e9b520618ff8" (UID: "b4550963-890e-4525-901a-e9b520618ff8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:30:03 crc kubenswrapper[4807]: I1127 11:30:03.062917 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4550963-890e-4525-901a-e9b520618ff8-kube-api-access-nw9pb" (OuterVolumeSpecName: "kube-api-access-nw9pb") pod "b4550963-890e-4525-901a-e9b520618ff8" (UID: "b4550963-890e-4525-901a-e9b520618ff8"). InnerVolumeSpecName "kube-api-access-nw9pb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:30:03 crc kubenswrapper[4807]: I1127 11:30:03.146863 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b4550963-890e-4525-901a-e9b520618ff8-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:03 crc kubenswrapper[4807]: I1127 11:30:03.146895 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw9pb\" (UniqueName: \"kubernetes.io/projected/b4550963-890e-4525-901a-e9b520618ff8-kube-api-access-nw9pb\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:03 crc kubenswrapper[4807]: I1127 11:30:03.691957 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" event={"ID":"b4550963-890e-4525-901a-e9b520618ff8","Type":"ContainerDied","Data":"ad39211a9f93afbcef5ec04195b0d279cffbc210edd39aacab84027669f7b2bf"} Nov 27 11:30:03 crc kubenswrapper[4807]: I1127 11:30:03.692317 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad39211a9f93afbcef5ec04195b0d279cffbc210edd39aacab84027669f7b2bf" Nov 27 11:30:03 crc kubenswrapper[4807]: I1127 11:30:03.691983 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7" Nov 27 11:30:03 crc kubenswrapper[4807]: I1127 11:30:03.781971 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e153e04c-cadb-4d8a-9863-9ef60eac08e9" containerName="rabbitmq" containerID="cri-o://d32d2e57181385a869fe2e5229d795557297ae0cf042b50e58d543c67a8c113c" gracePeriod=604796 Nov 27 11:30:04 crc kubenswrapper[4807]: I1127 11:30:04.778432 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b811158c-3b16-415b-95df-baba9483d782" containerName="rabbitmq" containerID="cri-o://c48e2fc701d89281007affabfe5eea7db62794ac6386d9ad592957f4b6d3e212" gracePeriod=604796 Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.315030 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.382033 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.382105 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-confd\") pod \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.382164 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-erlang-cookie\") pod \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\" (UID: 
\"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.382202 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-plugins\") pod \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.382226 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-plugins-conf\") pod \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.382367 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gnwm\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-kube-api-access-7gnwm\") pod \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.382409 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e153e04c-cadb-4d8a-9863-9ef60eac08e9-erlang-cookie-secret\") pod \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.382475 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e153e04c-cadb-4d8a-9863-9ef60eac08e9-pod-info\") pod \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.382512 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-tls\") pod \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.382603 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-server-conf\") pod \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.382631 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-config-data\") pod \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\" (UID: \"e153e04c-cadb-4d8a-9863-9ef60eac08e9\") " Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.383100 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e153e04c-cadb-4d8a-9863-9ef60eac08e9" (UID: "e153e04c-cadb-4d8a-9863-9ef60eac08e9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.383619 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e153e04c-cadb-4d8a-9863-9ef60eac08e9" (UID: "e153e04c-cadb-4d8a-9863-9ef60eac08e9"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.384238 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e153e04c-cadb-4d8a-9863-9ef60eac08e9" (UID: "e153e04c-cadb-4d8a-9863-9ef60eac08e9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.392508 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e153e04c-cadb-4d8a-9863-9ef60eac08e9-pod-info" (OuterVolumeSpecName: "pod-info") pod "e153e04c-cadb-4d8a-9863-9ef60eac08e9" (UID: "e153e04c-cadb-4d8a-9863-9ef60eac08e9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.393058 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e153e04c-cadb-4d8a-9863-9ef60eac08e9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e153e04c-cadb-4d8a-9863-9ef60eac08e9" (UID: "e153e04c-cadb-4d8a-9863-9ef60eac08e9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.394591 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-kube-api-access-7gnwm" (OuterVolumeSpecName: "kube-api-access-7gnwm") pod "e153e04c-cadb-4d8a-9863-9ef60eac08e9" (UID: "e153e04c-cadb-4d8a-9863-9ef60eac08e9"). InnerVolumeSpecName "kube-api-access-7gnwm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.399816 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "e153e04c-cadb-4d8a-9863-9ef60eac08e9" (UID: "e153e04c-cadb-4d8a-9863-9ef60eac08e9"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.406944 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e153e04c-cadb-4d8a-9863-9ef60eac08e9" (UID: "e153e04c-cadb-4d8a-9863-9ef60eac08e9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.444805 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-config-data" (OuterVolumeSpecName: "config-data") pod "e153e04c-cadb-4d8a-9863-9ef60eac08e9" (UID: "e153e04c-cadb-4d8a-9863-9ef60eac08e9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.455418 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-server-conf" (OuterVolumeSpecName: "server-conf") pod "e153e04c-cadb-4d8a-9863-9ef60eac08e9" (UID: "e153e04c-cadb-4d8a-9863-9ef60eac08e9"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.486581 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.486611 4807 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.486620 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gnwm\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-kube-api-access-7gnwm\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.486631 4807 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e153e04c-cadb-4d8a-9863-9ef60eac08e9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.486641 4807 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e153e04c-cadb-4d8a-9863-9ef60eac08e9-pod-info\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.486701 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.486710 4807 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-server-conf\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.486719 
4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e153e04c-cadb-4d8a-9863-9ef60eac08e9-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.486738 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.486747 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.509332 4807 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.535099 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e153e04c-cadb-4d8a-9863-9ef60eac08e9" (UID: "e153e04c-cadb-4d8a-9863-9ef60eac08e9"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.589546 4807 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.589579 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e153e04c-cadb-4d8a-9863-9ef60eac08e9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.757913 4807 generic.go:334] "Generic (PLEG): container finished" podID="e153e04c-cadb-4d8a-9863-9ef60eac08e9" containerID="d32d2e57181385a869fe2e5229d795557297ae0cf042b50e58d543c67a8c113c" exitCode=0 Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.757956 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e153e04c-cadb-4d8a-9863-9ef60eac08e9","Type":"ContainerDied","Data":"d32d2e57181385a869fe2e5229d795557297ae0cf042b50e58d543c67a8c113c"} Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.757981 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e153e04c-cadb-4d8a-9863-9ef60eac08e9","Type":"ContainerDied","Data":"9c9cd4fa96b0df6ce90c09b402bb6e5b15f47ed9cc97efe2e769bea43d7e936a"} Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.757996 4807 scope.go:117] "RemoveContainer" containerID="d32d2e57181385a869fe2e5229d795557297ae0cf042b50e58d543c67a8c113c" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.758101 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.791382 4807 scope.go:117] "RemoveContainer" containerID="83333aee30e5bc4ccaa87405f238b79bd7e64ebbe703fe9bddfe4658d8041fd5" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.796989 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.805034 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.829167 4807 scope.go:117] "RemoveContainer" containerID="d32d2e57181385a869fe2e5229d795557297ae0cf042b50e58d543c67a8c113c" Nov 27 11:30:10 crc kubenswrapper[4807]: E1127 11:30:10.829726 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d32d2e57181385a869fe2e5229d795557297ae0cf042b50e58d543c67a8c113c\": container with ID starting with d32d2e57181385a869fe2e5229d795557297ae0cf042b50e58d543c67a8c113c not found: ID does not exist" containerID="d32d2e57181385a869fe2e5229d795557297ae0cf042b50e58d543c67a8c113c" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.829764 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32d2e57181385a869fe2e5229d795557297ae0cf042b50e58d543c67a8c113c"} err="failed to get container status \"d32d2e57181385a869fe2e5229d795557297ae0cf042b50e58d543c67a8c113c\": rpc error: code = NotFound desc = could not find container \"d32d2e57181385a869fe2e5229d795557297ae0cf042b50e58d543c67a8c113c\": container with ID starting with d32d2e57181385a869fe2e5229d795557297ae0cf042b50e58d543c67a8c113c not found: ID does not exist" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.829795 4807 scope.go:117] "RemoveContainer" containerID="83333aee30e5bc4ccaa87405f238b79bd7e64ebbe703fe9bddfe4658d8041fd5" Nov 27 11:30:10 crc 
kubenswrapper[4807]: E1127 11:30:10.830182 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83333aee30e5bc4ccaa87405f238b79bd7e64ebbe703fe9bddfe4658d8041fd5\": container with ID starting with 83333aee30e5bc4ccaa87405f238b79bd7e64ebbe703fe9bddfe4658d8041fd5 not found: ID does not exist" containerID="83333aee30e5bc4ccaa87405f238b79bd7e64ebbe703fe9bddfe4658d8041fd5" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.830200 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83333aee30e5bc4ccaa87405f238b79bd7e64ebbe703fe9bddfe4658d8041fd5"} err="failed to get container status \"83333aee30e5bc4ccaa87405f238b79bd7e64ebbe703fe9bddfe4658d8041fd5\": rpc error: code = NotFound desc = could not find container \"83333aee30e5bc4ccaa87405f238b79bd7e64ebbe703fe9bddfe4658d8041fd5\": container with ID starting with 83333aee30e5bc4ccaa87405f238b79bd7e64ebbe703fe9bddfe4658d8041fd5 not found: ID does not exist" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.836320 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 11:30:10 crc kubenswrapper[4807]: E1127 11:30:10.836701 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e153e04c-cadb-4d8a-9863-9ef60eac08e9" containerName="rabbitmq" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.836714 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e153e04c-cadb-4d8a-9863-9ef60eac08e9" containerName="rabbitmq" Nov 27 11:30:10 crc kubenswrapper[4807]: E1127 11:30:10.836731 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e153e04c-cadb-4d8a-9863-9ef60eac08e9" containerName="setup-container" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.836740 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e153e04c-cadb-4d8a-9863-9ef60eac08e9" containerName="setup-container" Nov 27 11:30:10 crc 
kubenswrapper[4807]: E1127 11:30:10.836759 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4550963-890e-4525-901a-e9b520618ff8" containerName="collect-profiles" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.836765 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4550963-890e-4525-901a-e9b520618ff8" containerName="collect-profiles" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.836969 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4550963-890e-4525-901a-e9b520618ff8" containerName="collect-profiles" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.836982 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="e153e04c-cadb-4d8a-9863-9ef60eac08e9" containerName="rabbitmq" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.840181 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.845942 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.846152 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.846848 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.846986 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.847088 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-tt85w" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.847183 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 27 11:30:10 crc kubenswrapper[4807]: 
I1127 11:30:10.847316 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.860990 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.894949 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2dc733a-0951-4580-a301-d0dd7d7937f1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.894986 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2dc733a-0951-4580-a301-d0dd7d7937f1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.895041 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2dc733a-0951-4580-a301-d0dd7d7937f1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.895068 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2dc733a-0951-4580-a301-d0dd7d7937f1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.895100 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2dc733a-0951-4580-a301-d0dd7d7937f1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.895121 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2dc733a-0951-4580-a301-d0dd7d7937f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.895135 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2dc733a-0951-4580-a301-d0dd7d7937f1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.895184 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28h69\" (UniqueName: \"kubernetes.io/projected/c2dc733a-0951-4580-a301-d0dd7d7937f1-kube-api-access-28h69\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.895236 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2dc733a-0951-4580-a301-d0dd7d7937f1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.895268 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c2dc733a-0951-4580-a301-d0dd7d7937f1-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.895285 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.997747 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2dc733a-0951-4580-a301-d0dd7d7937f1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.998642 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2dc733a-0951-4580-a301-d0dd7d7937f1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.998763 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2dc733a-0951-4580-a301-d0dd7d7937f1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.998801 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2dc733a-0951-4580-a301-d0dd7d7937f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.998828 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2dc733a-0951-4580-a301-d0dd7d7937f1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.998909 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28h69\" (UniqueName: \"kubernetes.io/projected/c2dc733a-0951-4580-a301-d0dd7d7937f1-kube-api-access-28h69\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.998987 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2dc733a-0951-4580-a301-d0dd7d7937f1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.999014 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2dc733a-0951-4580-a301-d0dd7d7937f1-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.999037 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.999071 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2dc733a-0951-4580-a301-d0dd7d7937f1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.999092 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2dc733a-0951-4580-a301-d0dd7d7937f1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.999324 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2dc733a-0951-4580-a301-d0dd7d7937f1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.999563 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2dc733a-0951-4580-a301-d0dd7d7937f1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.999741 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Nov 27 11:30:10 crc kubenswrapper[4807]: I1127 11:30:10.999939 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/c2dc733a-0951-4580-a301-d0dd7d7937f1-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.001271 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2dc733a-0951-4580-a301-d0dd7d7937f1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.002180 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2dc733a-0951-4580-a301-d0dd7d7937f1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.002775 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2dc733a-0951-4580-a301-d0dd7d7937f1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.004070 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2dc733a-0951-4580-a301-d0dd7d7937f1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.004233 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2dc733a-0951-4580-a301-d0dd7d7937f1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:11 crc 
kubenswrapper[4807]: I1127 11:30:11.004756 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2dc733a-0951-4580-a301-d0dd7d7937f1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.019552 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28h69\" (UniqueName: \"kubernetes.io/projected/c2dc733a-0951-4580-a301-d0dd7d7937f1-kube-api-access-28h69\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.040320 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"c2dc733a-0951-4580-a301-d0dd7d7937f1\") " pod="openstack/rabbitmq-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.177749 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.361039 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.415658 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-confd\") pod \"b811158c-3b16-415b-95df-baba9483d782\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.415721 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b811158c-3b16-415b-95df-baba9483d782-pod-info\") pod \"b811158c-3b16-415b-95df-baba9483d782\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.415807 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn86n\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-kube-api-access-rn86n\") pod \"b811158c-3b16-415b-95df-baba9483d782\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.415857 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b811158c-3b16-415b-95df-baba9483d782-erlang-cookie-secret\") pod \"b811158c-3b16-415b-95df-baba9483d782\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.415890 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-plugins\") pod \"b811158c-3b16-415b-95df-baba9483d782\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.415940 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-config-data\") pod \"b811158c-3b16-415b-95df-baba9483d782\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.415989 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-server-conf\") pod \"b811158c-3b16-415b-95df-baba9483d782\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.416039 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-erlang-cookie\") pod \"b811158c-3b16-415b-95df-baba9483d782\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.416076 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-tls\") pod \"b811158c-3b16-415b-95df-baba9483d782\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.416104 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"b811158c-3b16-415b-95df-baba9483d782\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.416147 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-plugins-conf\") pod \"b811158c-3b16-415b-95df-baba9483d782\" (UID: \"b811158c-3b16-415b-95df-baba9483d782\") " Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 
11:30:11.417267 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b811158c-3b16-415b-95df-baba9483d782" (UID: "b811158c-3b16-415b-95df-baba9483d782"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.418328 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b811158c-3b16-415b-95df-baba9483d782" (UID: "b811158c-3b16-415b-95df-baba9483d782"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.421959 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b811158c-3b16-415b-95df-baba9483d782" (UID: "b811158c-3b16-415b-95df-baba9483d782"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.431403 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b811158c-3b16-415b-95df-baba9483d782" (UID: "b811158c-3b16-415b-95df-baba9483d782"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.431523 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-kube-api-access-rn86n" (OuterVolumeSpecName: "kube-api-access-rn86n") pod "b811158c-3b16-415b-95df-baba9483d782" (UID: "b811158c-3b16-415b-95df-baba9483d782"). InnerVolumeSpecName "kube-api-access-rn86n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.432116 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b811158c-3b16-415b-95df-baba9483d782-pod-info" (OuterVolumeSpecName: "pod-info") pod "b811158c-3b16-415b-95df-baba9483d782" (UID: "b811158c-3b16-415b-95df-baba9483d782"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.434565 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "b811158c-3b16-415b-95df-baba9483d782" (UID: "b811158c-3b16-415b-95df-baba9483d782"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.439119 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b811158c-3b16-415b-95df-baba9483d782-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b811158c-3b16-415b-95df-baba9483d782" (UID: "b811158c-3b16-415b-95df-baba9483d782"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.473326 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-config-data" (OuterVolumeSpecName: "config-data") pod "b811158c-3b16-415b-95df-baba9483d782" (UID: "b811158c-3b16-415b-95df-baba9483d782"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.503610 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-server-conf" (OuterVolumeSpecName: "server-conf") pod "b811158c-3b16-415b-95df-baba9483d782" (UID: "b811158c-3b16-415b-95df-baba9483d782"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.519443 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn86n\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-kube-api-access-rn86n\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.519488 4807 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b811158c-3b16-415b-95df-baba9483d782-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.519532 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.519545 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-config-data\") on node \"crc\" 
DevicePath \"\"" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.519557 4807 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-server-conf\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.519571 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.519636 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.519672 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.519703 4807 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b811158c-3b16-415b-95df-baba9483d782-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.519716 4807 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b811158c-3b16-415b-95df-baba9483d782-pod-info\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.548378 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e153e04c-cadb-4d8a-9863-9ef60eac08e9" path="/var/lib/kubelet/pods/e153e04c-cadb-4d8a-9863-9ef60eac08e9/volumes" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.555276 4807 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.555708 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b811158c-3b16-415b-95df-baba9483d782" (UID: "b811158c-3b16-415b-95df-baba9483d782"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.621333 4807 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b811158c-3b16-415b-95df-baba9483d782-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.621365 4807 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.645786 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.767236 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2dc733a-0951-4580-a301-d0dd7d7937f1","Type":"ContainerStarted","Data":"0214e4e074602ce8b56a59b9cc246de7906ab0ae1af990802719073ac08a5434"} Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.769400 4807 generic.go:334] "Generic (PLEG): container finished" podID="b811158c-3b16-415b-95df-baba9483d782" containerID="c48e2fc701d89281007affabfe5eea7db62794ac6386d9ad592957f4b6d3e212" exitCode=0 Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.769426 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"b811158c-3b16-415b-95df-baba9483d782","Type":"ContainerDied","Data":"c48e2fc701d89281007affabfe5eea7db62794ac6386d9ad592957f4b6d3e212"} Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.769442 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b811158c-3b16-415b-95df-baba9483d782","Type":"ContainerDied","Data":"283f0d96b24f760cf6e323867815a3f05b30b2c0ef9be4d581634c55ea9127f6"} Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.769459 4807 scope.go:117] "RemoveContainer" containerID="c48e2fc701d89281007affabfe5eea7db62794ac6386d9ad592957f4b6d3e212" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.769558 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.808572 4807 scope.go:117] "RemoveContainer" containerID="9e6765c1329a0e3dbe2c7ebbe98178418e0634a6b096c810e4a057d37376b58b" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.811093 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.826478 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.834713 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 11:30:11 crc kubenswrapper[4807]: E1127 11:30:11.835089 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b811158c-3b16-415b-95df-baba9483d782" containerName="setup-container" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.835107 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b811158c-3b16-415b-95df-baba9483d782" containerName="setup-container" Nov 27 11:30:11 crc kubenswrapper[4807]: E1127 11:30:11.835126 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b811158c-3b16-415b-95df-baba9483d782" containerName="rabbitmq" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.835133 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b811158c-3b16-415b-95df-baba9483d782" containerName="rabbitmq" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.835339 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b811158c-3b16-415b-95df-baba9483d782" containerName="rabbitmq" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.836232 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.838470 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.838655 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.838818 4807 scope.go:117] "RemoveContainer" containerID="c48e2fc701d89281007affabfe5eea7db62794ac6386d9ad592957f4b6d3e212" Nov 27 11:30:11 crc kubenswrapper[4807]: E1127 11:30:11.839209 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c48e2fc701d89281007affabfe5eea7db62794ac6386d9ad592957f4b6d3e212\": container with ID starting with c48e2fc701d89281007affabfe5eea7db62794ac6386d9ad592957f4b6d3e212 not found: ID does not exist" containerID="c48e2fc701d89281007affabfe5eea7db62794ac6386d9ad592957f4b6d3e212" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.839327 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c48e2fc701d89281007affabfe5eea7db62794ac6386d9ad592957f4b6d3e212"} err="failed to get container status \"c48e2fc701d89281007affabfe5eea7db62794ac6386d9ad592957f4b6d3e212\": rpc error: code = NotFound desc = could not 
find container \"c48e2fc701d89281007affabfe5eea7db62794ac6386d9ad592957f4b6d3e212\": container with ID starting with c48e2fc701d89281007affabfe5eea7db62794ac6386d9ad592957f4b6d3e212 not found: ID does not exist" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.839357 4807 scope.go:117] "RemoveContainer" containerID="9e6765c1329a0e3dbe2c7ebbe98178418e0634a6b096c810e4a057d37376b58b" Nov 27 11:30:11 crc kubenswrapper[4807]: E1127 11:30:11.839869 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e6765c1329a0e3dbe2c7ebbe98178418e0634a6b096c810e4a057d37376b58b\": container with ID starting with 9e6765c1329a0e3dbe2c7ebbe98178418e0634a6b096c810e4a057d37376b58b not found: ID does not exist" containerID="9e6765c1329a0e3dbe2c7ebbe98178418e0634a6b096c810e4a057d37376b58b" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.839908 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e6765c1329a0e3dbe2c7ebbe98178418e0634a6b096c810e4a057d37376b58b"} err="failed to get container status \"9e6765c1329a0e3dbe2c7ebbe98178418e0634a6b096c810e4a057d37376b58b\": rpc error: code = NotFound desc = could not find container \"9e6765c1329a0e3dbe2c7ebbe98178418e0634a6b096c810e4a057d37376b58b\": container with ID starting with 9e6765c1329a0e3dbe2c7ebbe98178418e0634a6b096c810e4a057d37376b58b not found: ID does not exist" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.840882 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.841308 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f6jm6" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.841347 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 27 11:30:11 crc 
kubenswrapper[4807]: I1127 11:30:11.841389 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.841640 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.858749 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.927509 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c679115a-3605-4e24-8d75-553d53d87f48-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.927554 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zts8\" (UniqueName: \"kubernetes.io/projected/c679115a-3605-4e24-8d75-553d53d87f48-kube-api-access-5zts8\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.927585 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c679115a-3605-4e24-8d75-553d53d87f48-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.927686 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/c679115a-3605-4e24-8d75-553d53d87f48-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.927721 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c679115a-3605-4e24-8d75-553d53d87f48-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.927742 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c679115a-3605-4e24-8d75-553d53d87f48-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.927763 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c679115a-3605-4e24-8d75-553d53d87f48-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.927991 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.928037 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/c679115a-3605-4e24-8d75-553d53d87f48-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.928094 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c679115a-3605-4e24-8d75-553d53d87f48-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:11 crc kubenswrapper[4807]: I1127 11:30:11.928280 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c679115a-3605-4e24-8d75-553d53d87f48-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.030520 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.030575 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c679115a-3605-4e24-8d75-553d53d87f48-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.030611 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c679115a-3605-4e24-8d75-553d53d87f48-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.030669 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c679115a-3605-4e24-8d75-553d53d87f48-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.030692 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c679115a-3605-4e24-8d75-553d53d87f48-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.030711 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zts8\" (UniqueName: \"kubernetes.io/projected/c679115a-3605-4e24-8d75-553d53d87f48-kube-api-access-5zts8\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.030740 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c679115a-3605-4e24-8d75-553d53d87f48-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.030804 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c679115a-3605-4e24-8d75-553d53d87f48-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.030844 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c679115a-3605-4e24-8d75-553d53d87f48-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.030867 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c679115a-3605-4e24-8d75-553d53d87f48-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.030887 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c679115a-3605-4e24-8d75-553d53d87f48-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.030887 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.031549 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c679115a-3605-4e24-8d75-553d53d87f48-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.031771 
4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c679115a-3605-4e24-8d75-553d53d87f48-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.031808 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c679115a-3605-4e24-8d75-553d53d87f48-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.031975 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c679115a-3605-4e24-8d75-553d53d87f48-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.032322 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c679115a-3605-4e24-8d75-553d53d87f48-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.036022 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c679115a-3605-4e24-8d75-553d53d87f48-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.037855 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/c679115a-3605-4e24-8d75-553d53d87f48-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.040752 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c679115a-3605-4e24-8d75-553d53d87f48-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.041932 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c679115a-3605-4e24-8d75-553d53d87f48-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.049510 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zts8\" (UniqueName: \"kubernetes.io/projected/c679115a-3605-4e24-8d75-553d53d87f48-kube-api-access-5zts8\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.074431 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c679115a-3605-4e24-8d75-553d53d87f48\") " pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.160443 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.619861 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 27 11:30:12 crc kubenswrapper[4807]: W1127 11:30:12.625533 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc679115a_3605_4e24_8d75_553d53d87f48.slice/crio-53f6f786905d0dcfca56645b290dfd78af41fec2fd881f3569b88bb79d1f9ea3 WatchSource:0}: Error finding container 53f6f786905d0dcfca56645b290dfd78af41fec2fd881f3569b88bb79d1f9ea3: Status 404 returned error can't find the container with id 53f6f786905d0dcfca56645b290dfd78af41fec2fd881f3569b88bb79d1f9ea3 Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.715236 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-wbpbx"] Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.717127 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.719079 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.745632 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-wbpbx"] Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.748758 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.748807 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.748847 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.748978 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: 
\"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.749211 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv9m9\" (UniqueName: \"kubernetes.io/projected/50533160-1f7c-454e-be40-ac9680bf6411-kube-api-access-qv9m9\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.749273 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.749499 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-config\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.791166 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c679115a-3605-4e24-8d75-553d53d87f48","Type":"ContainerStarted","Data":"53f6f786905d0dcfca56645b290dfd78af41fec2fd881f3569b88bb79d1f9ea3"} Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.851675 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-config\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc 
kubenswrapper[4807]: I1127 11:30:12.851758 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.851808 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.851844 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.851894 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.852011 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv9m9\" (UniqueName: \"kubernetes.io/projected/50533160-1f7c-454e-be40-ac9680bf6411-kube-api-access-qv9m9\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: 
I1127 11:30:12.852044 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.853665 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-config\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.853729 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.853766 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.853800 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.853819 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.854195 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.882566 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-wbpbx"] Nov 27 11:30:12 crc kubenswrapper[4807]: E1127 11:30:12.883355 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-qv9m9], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" podUID="50533160-1f7c-454e-be40-ac9680bf6411" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.913343 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-956t7"] Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.915617 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.921538 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv9m9\" (UniqueName: \"kubernetes.io/projected/50533160-1f7c-454e-be40-ac9680bf6411-kube-api-access-qv9m9\") pod \"dnsmasq-dns-79bd4cc8c9-wbpbx\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.922416 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-956t7"] Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.953807 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.953851 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs89d\" (UniqueName: \"kubernetes.io/projected/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-kube-api-access-fs89d\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.953906 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-config\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.953928 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.954121 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.954322 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-dns-svc\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:12 crc kubenswrapper[4807]: I1127 11:30:12.954388 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.056131 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.056403 
4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs89d\" (UniqueName: \"kubernetes.io/projected/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-kube-api-access-fs89d\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.056520 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-config\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.056598 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.056744 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.056895 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-dns-svc\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.057947 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.057717 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.057895 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-dns-svc\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.057665 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.058493 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-config\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.058631 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.058813 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.080304 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs89d\" (UniqueName: \"kubernetes.io/projected/6f7c0ed3-d807-4035-b4c9-a2f906d06c46-kube-api-access-fs89d\") pod \"dnsmasq-dns-55478c4467-956t7\" (UID: \"6f7c0ed3-d807-4035-b4c9-a2f906d06c46\") " pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.237087 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.569394 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b811158c-3b16-415b-95df-baba9483d782" path="/var/lib/kubelet/pods/b811158c-3b16-415b-95df-baba9483d782/volumes" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.735680 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-956t7"] Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.799988 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2dc733a-0951-4580-a301-d0dd7d7937f1","Type":"ContainerStarted","Data":"7b1b20fa47d10c248c211d42234364bad3a9cafffdde6b1292ff7f9e3a1c56a2"} Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.800007 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.813309 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.976106 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-dns-svc\") pod \"50533160-1f7c-454e-be40-ac9680bf6411\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.976674 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-ovsdbserver-nb\") pod \"50533160-1f7c-454e-be40-ac9680bf6411\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.976829 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-dns-swift-storage-0\") pod \"50533160-1f7c-454e-be40-ac9680bf6411\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.976986 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv9m9\" (UniqueName: \"kubernetes.io/projected/50533160-1f7c-454e-be40-ac9680bf6411-kube-api-access-qv9m9\") pod \"50533160-1f7c-454e-be40-ac9680bf6411\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.976613 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "50533160-1f7c-454e-be40-ac9680bf6411" (UID: "50533160-1f7c-454e-be40-ac9680bf6411"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.977027 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "50533160-1f7c-454e-be40-ac9680bf6411" (UID: "50533160-1f7c-454e-be40-ac9680bf6411"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.977281 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "50533160-1f7c-454e-be40-ac9680bf6411" (UID: "50533160-1f7c-454e-be40-ac9680bf6411"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.977493 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-ovsdbserver-sb\") pod \"50533160-1f7c-454e-be40-ac9680bf6411\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.977879 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-config\") pod \"50533160-1f7c-454e-be40-ac9680bf6411\" (UID: \"50533160-1f7c-454e-be40-ac9680bf6411\") " Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.978010 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-openstack-edpm-ipam\") pod \"50533160-1f7c-454e-be40-ac9680bf6411\" (UID: 
\"50533160-1f7c-454e-be40-ac9680bf6411\") " Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.977902 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "50533160-1f7c-454e-be40-ac9680bf6411" (UID: "50533160-1f7c-454e-be40-ac9680bf6411"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.978259 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-config" (OuterVolumeSpecName: "config") pod "50533160-1f7c-454e-be40-ac9680bf6411" (UID: "50533160-1f7c-454e-be40-ac9680bf6411"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.978700 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "50533160-1f7c-454e-be40-ac9680bf6411" (UID: "50533160-1f7c-454e-be40-ac9680bf6411"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.980632 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.982355 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.982374 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.982388 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.982413 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.982422 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50533160-1f7c-454e-be40-ac9680bf6411-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:13 crc kubenswrapper[4807]: I1127 11:30:13.982483 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50533160-1f7c-454e-be40-ac9680bf6411-kube-api-access-qv9m9" (OuterVolumeSpecName: "kube-api-access-qv9m9") pod "50533160-1f7c-454e-be40-ac9680bf6411" (UID: 
"50533160-1f7c-454e-be40-ac9680bf6411"). InnerVolumeSpecName "kube-api-access-qv9m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:30:14 crc kubenswrapper[4807]: I1127 11:30:14.083215 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv9m9\" (UniqueName: \"kubernetes.io/projected/50533160-1f7c-454e-be40-ac9680bf6411-kube-api-access-qv9m9\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:14 crc kubenswrapper[4807]: I1127 11:30:14.809626 4807 generic.go:334] "Generic (PLEG): container finished" podID="6f7c0ed3-d807-4035-b4c9-a2f906d06c46" containerID="8dcfd594e02fa6d3a994bd6e3178aad915a8fbf51d63878c32b16b30c698c20e" exitCode=0 Nov 27 11:30:14 crc kubenswrapper[4807]: I1127 11:30:14.809711 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-956t7" event={"ID":"6f7c0ed3-d807-4035-b4c9-a2f906d06c46","Type":"ContainerDied","Data":"8dcfd594e02fa6d3a994bd6e3178aad915a8fbf51d63878c32b16b30c698c20e"} Nov 27 11:30:14 crc kubenswrapper[4807]: I1127 11:30:14.809747 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-956t7" event={"ID":"6f7c0ed3-d807-4035-b4c9-a2f906d06c46","Type":"ContainerStarted","Data":"c83827ff37e02853a138d86e6ad258c2e9dba7aaf78afef06984e3fc43a3f024"} Nov 27 11:30:14 crc kubenswrapper[4807]: I1127 11:30:14.812299 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-wbpbx" Nov 27 11:30:14 crc kubenswrapper[4807]: I1127 11:30:14.812774 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c679115a-3605-4e24-8d75-553d53d87f48","Type":"ContainerStarted","Data":"70e75007d60def235c8e8f1dec9abf7cd79911b6ab242e28e0d742da8d67f750"} Nov 27 11:30:15 crc kubenswrapper[4807]: I1127 11:30:15.050417 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-wbpbx"] Nov 27 11:30:15 crc kubenswrapper[4807]: I1127 11:30:15.057117 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-wbpbx"] Nov 27 11:30:15 crc kubenswrapper[4807]: I1127 11:30:15.543927 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50533160-1f7c-454e-be40-ac9680bf6411" path="/var/lib/kubelet/pods/50533160-1f7c-454e-be40-ac9680bf6411/volumes" Nov 27 11:30:15 crc kubenswrapper[4807]: I1127 11:30:15.828405 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-956t7" event={"ID":"6f7c0ed3-d807-4035-b4c9-a2f906d06c46","Type":"ContainerStarted","Data":"7a46f27933755d8c2c4080e6f90ba585849a5381ad7c4684e79cf22e306e1fef"} Nov 27 11:30:16 crc kubenswrapper[4807]: I1127 11:30:16.841369 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:20 crc kubenswrapper[4807]: I1127 11:30:20.923233 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:30:20 crc kubenswrapper[4807]: I1127 11:30:20.924129 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" 
podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:30:20 crc kubenswrapper[4807]: I1127 11:30:20.924202 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:30:20 crc kubenswrapper[4807]: I1127 11:30:20.925817 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"039fe769a3645e4feb5875769fb8911726da6d6d91da20c8da03dd3106e1c39e"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 11:30:20 crc kubenswrapper[4807]: I1127 11:30:20.925959 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://039fe769a3645e4feb5875769fb8911726da6d6d91da20c8da03dd3106e1c39e" gracePeriod=600 Nov 27 11:30:21 crc kubenswrapper[4807]: I1127 11:30:21.890354 4807 generic.go:334] "Generic (PLEG): container finished" podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerID="039fe769a3645e4feb5875769fb8911726da6d6d91da20c8da03dd3106e1c39e" exitCode=0 Nov 27 11:30:21 crc kubenswrapper[4807]: I1127 11:30:21.890439 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"039fe769a3645e4feb5875769fb8911726da6d6d91da20c8da03dd3106e1c39e"} Nov 27 11:30:21 crc kubenswrapper[4807]: I1127 11:30:21.890939 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"c29d8c1f4aa1a4d993b85b54355b8275690d350f55ec0d06dbc91b3680f8a870"} Nov 27 11:30:21 crc kubenswrapper[4807]: I1127 11:30:21.890965 4807 scope.go:117] "RemoveContainer" containerID="f949ac50efe1cb33ac8f9f8fad96e486a8238fcb507e2fef2a39dd8e43ee4952" Nov 27 11:30:21 crc kubenswrapper[4807]: I1127 11:30:21.915671 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-956t7" podStartSLOduration=9.915649833 podStartE2EDuration="9.915649833s" podCreationTimestamp="2025-11-27 11:30:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:30:15.855551319 +0000 UTC m=+1256.955049517" watchObservedRunningTime="2025-11-27 11:30:21.915649833 +0000 UTC m=+1263.015148041" Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.239410 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-956t7" Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.296637 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-x7c26"] Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.298911 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" podUID="58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" containerName="dnsmasq-dns" containerID="cri-o://c9f7b52ff66e41ddd5c373e41ffdc8ec7d660078add95e78372e1d5765339479" gracePeriod=10 Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.859060 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.912327 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.912327 4807 generic.go:334] "Generic (PLEG): container finished" podID="58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" containerID="c9f7b52ff66e41ddd5c373e41ffdc8ec7d660078add95e78372e1d5765339479" exitCode=0 Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.912357 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" event={"ID":"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0","Type":"ContainerDied","Data":"c9f7b52ff66e41ddd5c373e41ffdc8ec7d660078add95e78372e1d5765339479"} Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.912686 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-x7c26" event={"ID":"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0","Type":"ContainerDied","Data":"3616d5eb225e4bdd9d0304ddffc2b099f8c2f9803ab9ae4388a63a407a7f5808"} Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.912708 4807 scope.go:117] "RemoveContainer" containerID="c9f7b52ff66e41ddd5c373e41ffdc8ec7d660078add95e78372e1d5765339479" Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.931433 4807 scope.go:117] "RemoveContainer" containerID="1aa9c12c12f88c896c38a8c3a8bf25f42a798ffb781f525fc1b363b8f328427f" Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.950202 4807 scope.go:117] "RemoveContainer" containerID="c9f7b52ff66e41ddd5c373e41ffdc8ec7d660078add95e78372e1d5765339479" Nov 27 11:30:23 crc kubenswrapper[4807]: E1127 11:30:23.950974 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f7b52ff66e41ddd5c373e41ffdc8ec7d660078add95e78372e1d5765339479\": container with ID starting with c9f7b52ff66e41ddd5c373e41ffdc8ec7d660078add95e78372e1d5765339479 not found: ID does not exist" containerID="c9f7b52ff66e41ddd5c373e41ffdc8ec7d660078add95e78372e1d5765339479" Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 
11:30:23.951022 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f7b52ff66e41ddd5c373e41ffdc8ec7d660078add95e78372e1d5765339479"} err="failed to get container status \"c9f7b52ff66e41ddd5c373e41ffdc8ec7d660078add95e78372e1d5765339479\": rpc error: code = NotFound desc = could not find container \"c9f7b52ff66e41ddd5c373e41ffdc8ec7d660078add95e78372e1d5765339479\": container with ID starting with c9f7b52ff66e41ddd5c373e41ffdc8ec7d660078add95e78372e1d5765339479 not found: ID does not exist" Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.951044 4807 scope.go:117] "RemoveContainer" containerID="1aa9c12c12f88c896c38a8c3a8bf25f42a798ffb781f525fc1b363b8f328427f" Nov 27 11:30:23 crc kubenswrapper[4807]: E1127 11:30:23.951447 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aa9c12c12f88c896c38a8c3a8bf25f42a798ffb781f525fc1b363b8f328427f\": container with ID starting with 1aa9c12c12f88c896c38a8c3a8bf25f42a798ffb781f525fc1b363b8f328427f not found: ID does not exist" containerID="1aa9c12c12f88c896c38a8c3a8bf25f42a798ffb781f525fc1b363b8f328427f" Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.951534 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aa9c12c12f88c896c38a8c3a8bf25f42a798ffb781f525fc1b363b8f328427f"} err="failed to get container status \"1aa9c12c12f88c896c38a8c3a8bf25f42a798ffb781f525fc1b363b8f328427f\": rpc error: code = NotFound desc = could not find container \"1aa9c12c12f88c896c38a8c3a8bf25f42a798ffb781f525fc1b363b8f328427f\": container with ID starting with 1aa9c12c12f88c896c38a8c3a8bf25f42a798ffb781f525fc1b363b8f328427f not found: ID does not exist" Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.980202 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-ovsdbserver-nb\") pod \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.980528 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-ovsdbserver-sb\") pod \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.980717 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-dns-swift-storage-0\") pod \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.980840 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-config\") pod \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.980975 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-dns-svc\") pod \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 11:30:23.981114 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcxbj\" (UniqueName: \"kubernetes.io/projected/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-kube-api-access-jcxbj\") pod \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\" (UID: \"58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0\") " Nov 27 11:30:23 crc kubenswrapper[4807]: I1127 
11:30:23.987028 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-kube-api-access-jcxbj" (OuterVolumeSpecName: "kube-api-access-jcxbj") pod "58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" (UID: "58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0"). InnerVolumeSpecName "kube-api-access-jcxbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:30:24 crc kubenswrapper[4807]: I1127 11:30:24.030206 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" (UID: "58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:24 crc kubenswrapper[4807]: I1127 11:30:24.030217 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-config" (OuterVolumeSpecName: "config") pod "58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" (UID: "58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:24 crc kubenswrapper[4807]: I1127 11:30:24.042177 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" (UID: "58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:24 crc kubenswrapper[4807]: I1127 11:30:24.049159 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" (UID: "58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:24 crc kubenswrapper[4807]: I1127 11:30:24.050115 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" (UID: "58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:30:24 crc kubenswrapper[4807]: I1127 11:30:24.083362 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcxbj\" (UniqueName: \"kubernetes.io/projected/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-kube-api-access-jcxbj\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:24 crc kubenswrapper[4807]: I1127 11:30:24.083397 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:24 crc kubenswrapper[4807]: I1127 11:30:24.083411 4807 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:24 crc kubenswrapper[4807]: I1127 11:30:24.083423 4807 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Nov 27 11:30:24 crc kubenswrapper[4807]: I1127 11:30:24.083436 4807 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-config\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:24 crc kubenswrapper[4807]: I1127 11:30:24.083448 4807 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:24 crc kubenswrapper[4807]: I1127 11:30:24.244050 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-x7c26"] Nov 27 11:30:24 crc kubenswrapper[4807]: I1127 11:30:24.253368 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-x7c26"] Nov 27 11:30:25 crc kubenswrapper[4807]: I1127 11:30:25.552336 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" path="/var/lib/kubelet/pods/58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0/volumes" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.299142 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh"] Nov 27 11:30:32 crc kubenswrapper[4807]: E1127 11:30:32.299864 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" containerName="dnsmasq-dns" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.299878 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" containerName="dnsmasq-dns" Nov 27 11:30:32 crc kubenswrapper[4807]: E1127 11:30:32.299893 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" containerName="init" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.299899 4807 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" containerName="init" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.300086 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="58a3e38f-7973-4aaf-8c5b-2f5f85dc1cc0" containerName="dnsmasq-dns" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.300641 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.309411 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.309828 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.310005 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.310079 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.332208 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7tpl\" (UniqueName: \"kubernetes.io/projected/c530987e-af49-45dc-ae6e-13c19df75606-kube-api-access-s7tpl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.332283 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.332325 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.332622 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.332786 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh"] Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.434022 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.434077 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.434191 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.434237 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7tpl\" (UniqueName: \"kubernetes.io/projected/c530987e-af49-45dc-ae6e-13c19df75606-kube-api-access-s7tpl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.440824 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.440899 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.440961 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.451149 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7tpl\" (UniqueName: \"kubernetes.io/projected/c530987e-af49-45dc-ae6e-13c19df75606-kube-api-access-s7tpl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:32 crc kubenswrapper[4807]: I1127 11:30:32.632232 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:33 crc kubenswrapper[4807]: I1127 11:30:33.185987 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh"] Nov 27 11:30:34 crc kubenswrapper[4807]: I1127 11:30:34.005823 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" event={"ID":"c530987e-af49-45dc-ae6e-13c19df75606","Type":"ContainerStarted","Data":"7218e6cc11a88d6864a1a959e40f1318127120bd5303518e358c2deb12da2f4b"} Nov 27 11:30:42 crc kubenswrapper[4807]: I1127 11:30:42.091881 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" event={"ID":"c530987e-af49-45dc-ae6e-13c19df75606","Type":"ContainerStarted","Data":"94210d58e4e8fea089523ffd3b7f60211fd9bdea52bf737f1d0e825821595976"} Nov 27 11:30:46 crc kubenswrapper[4807]: I1127 11:30:46.147170 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="c2dc733a-0951-4580-a301-d0dd7d7937f1" containerID="7b1b20fa47d10c248c211d42234364bad3a9cafffdde6b1292ff7f9e3a1c56a2" exitCode=0 Nov 27 11:30:46 crc kubenswrapper[4807]: I1127 11:30:46.147273 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2dc733a-0951-4580-a301-d0dd7d7937f1","Type":"ContainerDied","Data":"7b1b20fa47d10c248c211d42234364bad3a9cafffdde6b1292ff7f9e3a1c56a2"} Nov 27 11:30:46 crc kubenswrapper[4807]: I1127 11:30:46.173172 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" podStartSLOduration=5.895387538 podStartE2EDuration="14.173151265s" podCreationTimestamp="2025-11-27 11:30:32 +0000 UTC" firstStartedPulling="2025-11-27 11:30:33.191322494 +0000 UTC m=+1274.290820692" lastFinishedPulling="2025-11-27 11:30:41.469086221 +0000 UTC m=+1282.568584419" observedRunningTime="2025-11-27 11:30:42.111997383 +0000 UTC m=+1283.211495581" watchObservedRunningTime="2025-11-27 11:30:46.173151265 +0000 UTC m=+1287.272649473" Nov 27 11:30:47 crc kubenswrapper[4807]: I1127 11:30:47.162218 4807 generic.go:334] "Generic (PLEG): container finished" podID="c679115a-3605-4e24-8d75-553d53d87f48" containerID="70e75007d60def235c8e8f1dec9abf7cd79911b6ab242e28e0d742da8d67f750" exitCode=0 Nov 27 11:30:47 crc kubenswrapper[4807]: I1127 11:30:47.162447 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c679115a-3605-4e24-8d75-553d53d87f48","Type":"ContainerDied","Data":"70e75007d60def235c8e8f1dec9abf7cd79911b6ab242e28e0d742da8d67f750"} Nov 27 11:30:47 crc kubenswrapper[4807]: I1127 11:30:47.165839 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2dc733a-0951-4580-a301-d0dd7d7937f1","Type":"ContainerStarted","Data":"fb0d0ae657581003cb707c5b91d7572405ed0256fcdf35b277744d76da56a630"} Nov 27 11:30:47 crc kubenswrapper[4807]: 
I1127 11:30:47.166040 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 27 11:30:47 crc kubenswrapper[4807]: I1127 11:30:47.213275 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.213234007 podStartE2EDuration="37.213234007s" podCreationTimestamp="2025-11-27 11:30:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:30:47.212816486 +0000 UTC m=+1288.312314744" watchObservedRunningTime="2025-11-27 11:30:47.213234007 +0000 UTC m=+1288.312732205" Nov 27 11:30:48 crc kubenswrapper[4807]: I1127 11:30:48.176588 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c679115a-3605-4e24-8d75-553d53d87f48","Type":"ContainerStarted","Data":"6eb204a50db63b2f81575a6f74e3c2e408ebb15bb8e11fcdc2d0836f77b9439c"} Nov 27 11:30:48 crc kubenswrapper[4807]: I1127 11:30:48.177444 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:30:48 crc kubenswrapper[4807]: I1127 11:30:48.203299 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.203281048 podStartE2EDuration="37.203281048s" podCreationTimestamp="2025-11-27 11:30:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:30:48.19640431 +0000 UTC m=+1289.295902528" watchObservedRunningTime="2025-11-27 11:30:48.203281048 +0000 UTC m=+1289.302779246" Nov 27 11:30:53 crc kubenswrapper[4807]: I1127 11:30:53.229583 4807 generic.go:334] "Generic (PLEG): container finished" podID="c530987e-af49-45dc-ae6e-13c19df75606" containerID="94210d58e4e8fea089523ffd3b7f60211fd9bdea52bf737f1d0e825821595976" exitCode=0 Nov 27 
11:30:53 crc kubenswrapper[4807]: I1127 11:30:53.229642 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" event={"ID":"c530987e-af49-45dc-ae6e-13c19df75606","Type":"ContainerDied","Data":"94210d58e4e8fea089523ffd3b7f60211fd9bdea52bf737f1d0e825821595976"} Nov 27 11:30:54 crc kubenswrapper[4807]: I1127 11:30:54.683052 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:54 crc kubenswrapper[4807]: I1127 11:30:54.744874 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-inventory\") pod \"c530987e-af49-45dc-ae6e-13c19df75606\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " Nov 27 11:30:54 crc kubenswrapper[4807]: I1127 11:30:54.744980 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-ssh-key\") pod \"c530987e-af49-45dc-ae6e-13c19df75606\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " Nov 27 11:30:54 crc kubenswrapper[4807]: I1127 11:30:54.745065 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-repo-setup-combined-ca-bundle\") pod \"c530987e-af49-45dc-ae6e-13c19df75606\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " Nov 27 11:30:54 crc kubenswrapper[4807]: I1127 11:30:54.745091 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7tpl\" (UniqueName: \"kubernetes.io/projected/c530987e-af49-45dc-ae6e-13c19df75606-kube-api-access-s7tpl\") pod \"c530987e-af49-45dc-ae6e-13c19df75606\" (UID: \"c530987e-af49-45dc-ae6e-13c19df75606\") " Nov 27 11:30:54 crc 
kubenswrapper[4807]: I1127 11:30:54.753314 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c530987e-af49-45dc-ae6e-13c19df75606" (UID: "c530987e-af49-45dc-ae6e-13c19df75606"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:30:54 crc kubenswrapper[4807]: I1127 11:30:54.758638 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c530987e-af49-45dc-ae6e-13c19df75606-kube-api-access-s7tpl" (OuterVolumeSpecName: "kube-api-access-s7tpl") pod "c530987e-af49-45dc-ae6e-13c19df75606" (UID: "c530987e-af49-45dc-ae6e-13c19df75606"). InnerVolumeSpecName "kube-api-access-s7tpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:30:54 crc kubenswrapper[4807]: I1127 11:30:54.777334 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c530987e-af49-45dc-ae6e-13c19df75606" (UID: "c530987e-af49-45dc-ae6e-13c19df75606"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:30:54 crc kubenswrapper[4807]: I1127 11:30:54.779933 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-inventory" (OuterVolumeSpecName: "inventory") pod "c530987e-af49-45dc-ae6e-13c19df75606" (UID: "c530987e-af49-45dc-ae6e-13c19df75606"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:30:54 crc kubenswrapper[4807]: I1127 11:30:54.846921 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:54 crc kubenswrapper[4807]: I1127 11:30:54.846950 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:54 crc kubenswrapper[4807]: I1127 11:30:54.846960 4807 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c530987e-af49-45dc-ae6e-13c19df75606-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:54 crc kubenswrapper[4807]: I1127 11:30:54.846972 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7tpl\" (UniqueName: \"kubernetes.io/projected/c530987e-af49-45dc-ae6e-13c19df75606-kube-api-access-s7tpl\") on node \"crc\" DevicePath \"\"" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.255840 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" event={"ID":"c530987e-af49-45dc-ae6e-13c19df75606","Type":"ContainerDied","Data":"7218e6cc11a88d6864a1a959e40f1318127120bd5303518e358c2deb12da2f4b"} Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.256145 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7218e6cc11a88d6864a1a959e40f1318127120bd5303518e358c2deb12da2f4b" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.255942 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.340131 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h"] Nov 27 11:30:55 crc kubenswrapper[4807]: E1127 11:30:55.340680 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c530987e-af49-45dc-ae6e-13c19df75606" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.340705 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="c530987e-af49-45dc-ae6e-13c19df75606" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.340896 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="c530987e-af49-45dc-ae6e-13c19df75606" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.341462 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.343527 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.343891 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.344232 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.344753 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.359277 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h"] Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.457663 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edca8731-6d7e-44e5-b2a3-8622578409df-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xwp2h\" (UID: \"edca8731-6d7e-44e5-b2a3-8622578409df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.458010 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kww7b\" (UniqueName: \"kubernetes.io/projected/edca8731-6d7e-44e5-b2a3-8622578409df-kube-api-access-kww7b\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xwp2h\" (UID: \"edca8731-6d7e-44e5-b2a3-8622578409df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.458462 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edca8731-6d7e-44e5-b2a3-8622578409df-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xwp2h\" (UID: \"edca8731-6d7e-44e5-b2a3-8622578409df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.559910 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edca8731-6d7e-44e5-b2a3-8622578409df-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xwp2h\" (UID: \"edca8731-6d7e-44e5-b2a3-8622578409df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.560001 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edca8731-6d7e-44e5-b2a3-8622578409df-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xwp2h\" (UID: \"edca8731-6d7e-44e5-b2a3-8622578409df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.560058 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kww7b\" (UniqueName: \"kubernetes.io/projected/edca8731-6d7e-44e5-b2a3-8622578409df-kube-api-access-kww7b\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xwp2h\" (UID: \"edca8731-6d7e-44e5-b2a3-8622578409df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.563159 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edca8731-6d7e-44e5-b2a3-8622578409df-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xwp2h\" (UID: \"edca8731-6d7e-44e5-b2a3-8622578409df\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.567401 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edca8731-6d7e-44e5-b2a3-8622578409df-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xwp2h\" (UID: \"edca8731-6d7e-44e5-b2a3-8622578409df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.578554 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kww7b\" (UniqueName: \"kubernetes.io/projected/edca8731-6d7e-44e5-b2a3-8622578409df-kube-api-access-kww7b\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-xwp2h\" (UID: \"edca8731-6d7e-44e5-b2a3-8622578409df\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" Nov 27 11:30:55 crc kubenswrapper[4807]: I1127 11:30:55.674234 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" Nov 27 11:30:56 crc kubenswrapper[4807]: I1127 11:30:56.234127 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h"] Nov 27 11:30:56 crc kubenswrapper[4807]: I1127 11:30:56.265400 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" event={"ID":"edca8731-6d7e-44e5-b2a3-8622578409df","Type":"ContainerStarted","Data":"77606fd8f5f36a8345a94094f7f58074356d4925fa9d11aab979e8d2cdc32374"} Nov 27 11:30:57 crc kubenswrapper[4807]: I1127 11:30:57.281440 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" event={"ID":"edca8731-6d7e-44e5-b2a3-8622578409df","Type":"ContainerStarted","Data":"c64131508d7195d506a9f78dfc35cce8b06e6396a1973887b05af8764f9ee91e"} Nov 27 11:30:57 crc kubenswrapper[4807]: I1127 11:30:57.302316 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" podStartSLOduration=1.857600788 podStartE2EDuration="2.302294225s" podCreationTimestamp="2025-11-27 11:30:55 +0000 UTC" firstStartedPulling="2025-11-27 11:30:56.238097155 +0000 UTC m=+1297.337595353" lastFinishedPulling="2025-11-27 11:30:56.682790592 +0000 UTC m=+1297.782288790" observedRunningTime="2025-11-27 11:30:57.296898045 +0000 UTC m=+1298.396396253" watchObservedRunningTime="2025-11-27 11:30:57.302294225 +0000 UTC m=+1298.401792443" Nov 27 11:31:00 crc kubenswrapper[4807]: I1127 11:31:00.314824 4807 generic.go:334] "Generic (PLEG): container finished" podID="edca8731-6d7e-44e5-b2a3-8622578409df" containerID="c64131508d7195d506a9f78dfc35cce8b06e6396a1973887b05af8764f9ee91e" exitCode=0 Nov 27 11:31:00 crc kubenswrapper[4807]: I1127 11:31:00.314919 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" event={"ID":"edca8731-6d7e-44e5-b2a3-8622578409df","Type":"ContainerDied","Data":"c64131508d7195d506a9f78dfc35cce8b06e6396a1973887b05af8764f9ee91e"} Nov 27 11:31:01 crc kubenswrapper[4807]: I1127 11:31:01.181489 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 27 11:31:01 crc kubenswrapper[4807]: I1127 11:31:01.790889 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" Nov 27 11:31:01 crc kubenswrapper[4807]: I1127 11:31:01.890211 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edca8731-6d7e-44e5-b2a3-8622578409df-ssh-key\") pod \"edca8731-6d7e-44e5-b2a3-8622578409df\" (UID: \"edca8731-6d7e-44e5-b2a3-8622578409df\") " Nov 27 11:31:01 crc kubenswrapper[4807]: I1127 11:31:01.890523 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kww7b\" (UniqueName: \"kubernetes.io/projected/edca8731-6d7e-44e5-b2a3-8622578409df-kube-api-access-kww7b\") pod \"edca8731-6d7e-44e5-b2a3-8622578409df\" (UID: \"edca8731-6d7e-44e5-b2a3-8622578409df\") " Nov 27 11:31:01 crc kubenswrapper[4807]: I1127 11:31:01.891011 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edca8731-6d7e-44e5-b2a3-8622578409df-inventory\") pod \"edca8731-6d7e-44e5-b2a3-8622578409df\" (UID: \"edca8731-6d7e-44e5-b2a3-8622578409df\") " Nov 27 11:31:01 crc kubenswrapper[4807]: I1127 11:31:01.903471 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edca8731-6d7e-44e5-b2a3-8622578409df-kube-api-access-kww7b" (OuterVolumeSpecName: "kube-api-access-kww7b") pod "edca8731-6d7e-44e5-b2a3-8622578409df" (UID: 
"edca8731-6d7e-44e5-b2a3-8622578409df"). InnerVolumeSpecName "kube-api-access-kww7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:31:01 crc kubenswrapper[4807]: I1127 11:31:01.918131 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edca8731-6d7e-44e5-b2a3-8622578409df-inventory" (OuterVolumeSpecName: "inventory") pod "edca8731-6d7e-44e5-b2a3-8622578409df" (UID: "edca8731-6d7e-44e5-b2a3-8622578409df"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:31:01 crc kubenswrapper[4807]: I1127 11:31:01.921498 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edca8731-6d7e-44e5-b2a3-8622578409df-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "edca8731-6d7e-44e5-b2a3-8622578409df" (UID: "edca8731-6d7e-44e5-b2a3-8622578409df"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:31:01 crc kubenswrapper[4807]: I1127 11:31:01.993374 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/edca8731-6d7e-44e5-b2a3-8622578409df-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:31:01 crc kubenswrapper[4807]: I1127 11:31:01.993409 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/edca8731-6d7e-44e5-b2a3-8622578409df-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:31:01 crc kubenswrapper[4807]: I1127 11:31:01.993421 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kww7b\" (UniqueName: \"kubernetes.io/projected/edca8731-6d7e-44e5-b2a3-8622578409df-kube-api-access-kww7b\") on node \"crc\" DevicePath \"\"" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.163381 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 
11:31:02.350260 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" event={"ID":"edca8731-6d7e-44e5-b2a3-8622578409df","Type":"ContainerDied","Data":"77606fd8f5f36a8345a94094f7f58074356d4925fa9d11aab979e8d2cdc32374"} Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.350298 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77606fd8f5f36a8345a94094f7f58074356d4925fa9d11aab979e8d2cdc32374" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.350343 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-xwp2h" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.437692 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb"] Nov 27 11:31:02 crc kubenswrapper[4807]: E1127 11:31:02.438072 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edca8731-6d7e-44e5-b2a3-8622578409df" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.438088 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="edca8731-6d7e-44e5-b2a3-8622578409df" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.440062 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="edca8731-6d7e-44e5-b2a3-8622578409df" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.441339 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.449091 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.453879 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.454447 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.454607 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.490766 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb"] Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.504591 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.504725 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7g2n\" (UniqueName: \"kubernetes.io/projected/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-kube-api-access-n7g2n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.504775 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.504835 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.606546 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.606666 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.606781 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.606885 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7g2n\" (UniqueName: \"kubernetes.io/projected/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-kube-api-access-n7g2n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.610015 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.613547 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.615903 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.623763 4807 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-n7g2n\" (UniqueName: \"kubernetes.io/projected/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-kube-api-access-n7g2n\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:02 crc kubenswrapper[4807]: I1127 11:31:02.763713 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:31:03 crc kubenswrapper[4807]: I1127 11:31:03.297376 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb"] Nov 27 11:31:03 crc kubenswrapper[4807]: I1127 11:31:03.359616 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" event={"ID":"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa","Type":"ContainerStarted","Data":"02861a4c108388d00374153fdb7f140af5af1c665ec8a143276302afee11913b"} Nov 27 11:31:04 crc kubenswrapper[4807]: I1127 11:31:04.369866 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" event={"ID":"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa","Type":"ContainerStarted","Data":"a2062a7daed3324fde03dd146c41ac932150d3c208e869f723ac11daee7f7b06"} Nov 27 11:32:23 crc kubenswrapper[4807]: I1127 11:32:23.738051 4807 scope.go:117] "RemoveContainer" containerID="5f0391f8213a2b3e2666161297485f2946f5f082a36ee68c584d63c5c389296b" Nov 27 11:32:23 crc kubenswrapper[4807]: I1127 11:32:23.765925 4807 scope.go:117] "RemoveContainer" containerID="e395946d5b7983decac1d895d83ca3d8f3b7b0b1ee7634702f1bac34d0480104" Nov 27 11:32:50 crc kubenswrapper[4807]: I1127 11:32:50.921553 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:32:50 crc kubenswrapper[4807]: I1127 11:32:50.922023 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:33:20 crc kubenswrapper[4807]: I1127 11:33:20.921437 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:33:20 crc kubenswrapper[4807]: I1127 11:33:20.922141 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:33:50 crc kubenswrapper[4807]: I1127 11:33:50.921763 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:33:50 crc kubenswrapper[4807]: I1127 11:33:50.922302 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Nov 27 11:33:50 crc kubenswrapper[4807]: I1127 11:33:50.922346 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:33:50 crc kubenswrapper[4807]: I1127 11:33:50.922923 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c29d8c1f4aa1a4d993b85b54355b8275690d350f55ec0d06dbc91b3680f8a870"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 11:33:50 crc kubenswrapper[4807]: I1127 11:33:50.922972 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://c29d8c1f4aa1a4d993b85b54355b8275690d350f55ec0d06dbc91b3680f8a870" gracePeriod=600 Nov 27 11:33:51 crc kubenswrapper[4807]: I1127 11:33:51.969136 4807 generic.go:334] "Generic (PLEG): container finished" podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerID="c29d8c1f4aa1a4d993b85b54355b8275690d350f55ec0d06dbc91b3680f8a870" exitCode=0 Nov 27 11:33:51 crc kubenswrapper[4807]: I1127 11:33:51.969229 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"c29d8c1f4aa1a4d993b85b54355b8275690d350f55ec0d06dbc91b3680f8a870"} Nov 27 11:33:51 crc kubenswrapper[4807]: I1127 11:33:51.969594 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88"} Nov 27 11:33:51 
crc kubenswrapper[4807]: I1127 11:33:51.969618 4807 scope.go:117] "RemoveContainer" containerID="039fe769a3645e4feb5875769fb8911726da6d6d91da20c8da03dd3106e1c39e" Nov 27 11:33:51 crc kubenswrapper[4807]: I1127 11:33:51.997205 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" podStartSLOduration=169.463565447 podStartE2EDuration="2m49.997182276s" podCreationTimestamp="2025-11-27 11:31:02 +0000 UTC" firstStartedPulling="2025-11-27 11:31:03.297924023 +0000 UTC m=+1304.397422221" lastFinishedPulling="2025-11-27 11:31:03.831540852 +0000 UTC m=+1304.931039050" observedRunningTime="2025-11-27 11:31:04.391711363 +0000 UTC m=+1305.491209561" watchObservedRunningTime="2025-11-27 11:33:51.997182276 +0000 UTC m=+1473.096680474" Nov 27 11:34:03 crc kubenswrapper[4807]: I1127 11:34:03.064199 4807 generic.go:334] "Generic (PLEG): container finished" podID="a94d3cc9-c680-4f54-a2b6-0f55690f4cfa" containerID="a2062a7daed3324fde03dd146c41ac932150d3c208e869f723ac11daee7f7b06" exitCode=0 Nov 27 11:34:03 crc kubenswrapper[4807]: I1127 11:34:03.064282 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" event={"ID":"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa","Type":"ContainerDied","Data":"a2062a7daed3324fde03dd146c41ac932150d3c208e869f723ac11daee7f7b06"} Nov 27 11:34:04 crc kubenswrapper[4807]: I1127 11:34:04.495600 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:34:04 crc kubenswrapper[4807]: I1127 11:34:04.677104 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-ssh-key\") pod \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " Nov 27 11:34:04 crc kubenswrapper[4807]: I1127 11:34:04.677156 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-bootstrap-combined-ca-bundle\") pod \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " Nov 27 11:34:04 crc kubenswrapper[4807]: I1127 11:34:04.677197 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7g2n\" (UniqueName: \"kubernetes.io/projected/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-kube-api-access-n7g2n\") pod \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " Nov 27 11:34:04 crc kubenswrapper[4807]: I1127 11:34:04.677257 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-inventory\") pod \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\" (UID: \"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa\") " Nov 27 11:34:04 crc kubenswrapper[4807]: I1127 11:34:04.683852 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a94d3cc9-c680-4f54-a2b6-0f55690f4cfa" (UID: "a94d3cc9-c680-4f54-a2b6-0f55690f4cfa"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:34:04 crc kubenswrapper[4807]: I1127 11:34:04.690965 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-kube-api-access-n7g2n" (OuterVolumeSpecName: "kube-api-access-n7g2n") pod "a94d3cc9-c680-4f54-a2b6-0f55690f4cfa" (UID: "a94d3cc9-c680-4f54-a2b6-0f55690f4cfa"). InnerVolumeSpecName "kube-api-access-n7g2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:34:04 crc kubenswrapper[4807]: I1127 11:34:04.706827 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-inventory" (OuterVolumeSpecName: "inventory") pod "a94d3cc9-c680-4f54-a2b6-0f55690f4cfa" (UID: "a94d3cc9-c680-4f54-a2b6-0f55690f4cfa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:34:04 crc kubenswrapper[4807]: I1127 11:34:04.713785 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a94d3cc9-c680-4f54-a2b6-0f55690f4cfa" (UID: "a94d3cc9-c680-4f54-a2b6-0f55690f4cfa"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:34:04 crc kubenswrapper[4807]: I1127 11:34:04.778189 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:34:04 crc kubenswrapper[4807]: I1127 11:34:04.778420 4807 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:34:04 crc kubenswrapper[4807]: I1127 11:34:04.778433 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7g2n\" (UniqueName: \"kubernetes.io/projected/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-kube-api-access-n7g2n\") on node \"crc\" DevicePath \"\"" Nov 27 11:34:04 crc kubenswrapper[4807]: I1127 11:34:04.778442 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a94d3cc9-c680-4f54-a2b6-0f55690f4cfa-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.081475 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" event={"ID":"a94d3cc9-c680-4f54-a2b6-0f55690f4cfa","Type":"ContainerDied","Data":"02861a4c108388d00374153fdb7f140af5af1c665ec8a143276302afee11913b"} Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.081517 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02861a4c108388d00374153fdb7f140af5af1c665ec8a143276302afee11913b" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.081607 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.163362 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x"] Nov 27 11:34:05 crc kubenswrapper[4807]: E1127 11:34:05.163850 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94d3cc9-c680-4f54-a2b6-0f55690f4cfa" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.163868 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94d3cc9-c680-4f54-a2b6-0f55690f4cfa" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.164054 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a94d3cc9-c680-4f54-a2b6-0f55690f4cfa" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.164710 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.167492 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.167893 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.168041 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.168149 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.176112 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x"] Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.200446 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cfe00fa-307e-460b-a77e-a57439954c87-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x\" (UID: \"7cfe00fa-307e-460b-a77e-a57439954c87\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.200552 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djg7m\" (UniqueName: \"kubernetes.io/projected/7cfe00fa-307e-460b-a77e-a57439954c87-kube-api-access-djg7m\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x\" (UID: \"7cfe00fa-307e-460b-a77e-a57439954c87\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 
11:34:05.200641 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7cfe00fa-307e-460b-a77e-a57439954c87-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x\" (UID: \"7cfe00fa-307e-460b-a77e-a57439954c87\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.302491 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djg7m\" (UniqueName: \"kubernetes.io/projected/7cfe00fa-307e-460b-a77e-a57439954c87-kube-api-access-djg7m\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x\" (UID: \"7cfe00fa-307e-460b-a77e-a57439954c87\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.302594 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7cfe00fa-307e-460b-a77e-a57439954c87-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x\" (UID: \"7cfe00fa-307e-460b-a77e-a57439954c87\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.302697 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cfe00fa-307e-460b-a77e-a57439954c87-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x\" (UID: \"7cfe00fa-307e-460b-a77e-a57439954c87\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.307559 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7cfe00fa-307e-460b-a77e-a57439954c87-ssh-key\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x\" (UID: \"7cfe00fa-307e-460b-a77e-a57439954c87\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.307787 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cfe00fa-307e-460b-a77e-a57439954c87-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x\" (UID: \"7cfe00fa-307e-460b-a77e-a57439954c87\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.320213 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djg7m\" (UniqueName: \"kubernetes.io/projected/7cfe00fa-307e-460b-a77e-a57439954c87-kube-api-access-djg7m\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x\" (UID: \"7cfe00fa-307e-460b-a77e-a57439954c87\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.488408 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" Nov 27 11:34:05 crc kubenswrapper[4807]: W1127 11:34:05.984907 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cfe00fa_307e_460b_a77e_a57439954c87.slice/crio-240e4a5a0b290411672ddc15eea2bcafb99723ee18381b08ab11208b11574fd3 WatchSource:0}: Error finding container 240e4a5a0b290411672ddc15eea2bcafb99723ee18381b08ab11208b11574fd3: Status 404 returned error can't find the container with id 240e4a5a0b290411672ddc15eea2bcafb99723ee18381b08ab11208b11574fd3 Nov 27 11:34:05 crc kubenswrapper[4807]: I1127 11:34:05.986017 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x"] Nov 27 11:34:06 crc kubenswrapper[4807]: I1127 11:34:06.102541 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" event={"ID":"7cfe00fa-307e-460b-a77e-a57439954c87","Type":"ContainerStarted","Data":"240e4a5a0b290411672ddc15eea2bcafb99723ee18381b08ab11208b11574fd3"} Nov 27 11:34:07 crc kubenswrapper[4807]: I1127 11:34:07.110513 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" event={"ID":"7cfe00fa-307e-460b-a77e-a57439954c87","Type":"ContainerStarted","Data":"a658dfed16bb714c6cac2ff63947d595cca903c8cc8108421d36fe4542ff7c92"} Nov 27 11:34:07 crc kubenswrapper[4807]: I1127 11:34:07.135532 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" podStartSLOduration=1.6070373820000001 podStartE2EDuration="2.135514697s" podCreationTimestamp="2025-11-27 11:34:05 +0000 UTC" firstStartedPulling="2025-11-27 11:34:05.987102642 +0000 UTC m=+1487.086600840" lastFinishedPulling="2025-11-27 11:34:06.515579967 +0000 UTC 
m=+1487.615078155" observedRunningTime="2025-11-27 11:34:07.122780573 +0000 UTC m=+1488.222278791" watchObservedRunningTime="2025-11-27 11:34:07.135514697 +0000 UTC m=+1488.235012895" Nov 27 11:34:59 crc kubenswrapper[4807]: I1127 11:34:59.701633 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mf9jf"] Nov 27 11:34:59 crc kubenswrapper[4807]: I1127 11:34:59.737012 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mf9jf"] Nov 27 11:34:59 crc kubenswrapper[4807]: I1127 11:34:59.737116 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:34:59 crc kubenswrapper[4807]: I1127 11:34:59.819865 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-utilities\") pod \"redhat-operators-mf9jf\" (UID: \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\") " pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:34:59 crc kubenswrapper[4807]: I1127 11:34:59.819922 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn2xd\" (UniqueName: \"kubernetes.io/projected/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-kube-api-access-xn2xd\") pod \"redhat-operators-mf9jf\" (UID: \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\") " pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:34:59 crc kubenswrapper[4807]: I1127 11:34:59.820088 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-catalog-content\") pod \"redhat-operators-mf9jf\" (UID: \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\") " pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:34:59 crc kubenswrapper[4807]: I1127 
11:34:59.921411 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-catalog-content\") pod \"redhat-operators-mf9jf\" (UID: \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\") " pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:34:59 crc kubenswrapper[4807]: I1127 11:34:59.921553 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-utilities\") pod \"redhat-operators-mf9jf\" (UID: \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\") " pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:34:59 crc kubenswrapper[4807]: I1127 11:34:59.921589 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn2xd\" (UniqueName: \"kubernetes.io/projected/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-kube-api-access-xn2xd\") pod \"redhat-operators-mf9jf\" (UID: \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\") " pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:34:59 crc kubenswrapper[4807]: I1127 11:34:59.921899 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-catalog-content\") pod \"redhat-operators-mf9jf\" (UID: \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\") " pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:34:59 crc kubenswrapper[4807]: I1127 11:34:59.922532 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-utilities\") pod \"redhat-operators-mf9jf\" (UID: \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\") " pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:34:59 crc kubenswrapper[4807]: I1127 11:34:59.944220 4807 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xn2xd\" (UniqueName: \"kubernetes.io/projected/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-kube-api-access-xn2xd\") pod \"redhat-operators-mf9jf\" (UID: \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\") " pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:35:00 crc kubenswrapper[4807]: I1127 11:35:00.058964 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:35:00 crc kubenswrapper[4807]: I1127 11:35:00.542338 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mf9jf"] Nov 27 11:35:00 crc kubenswrapper[4807]: W1127 11:35:00.544466 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0f3d6bb_1c61_4a63_b0d5_1fe8a36a2293.slice/crio-82fbcdb176100179de8fff043c4e2b92127e3e24b26ec9762d196e5f161a5cab WatchSource:0}: Error finding container 82fbcdb176100179de8fff043c4e2b92127e3e24b26ec9762d196e5f161a5cab: Status 404 returned error can't find the container with id 82fbcdb176100179de8fff043c4e2b92127e3e24b26ec9762d196e5f161a5cab Nov 27 11:35:00 crc kubenswrapper[4807]: I1127 11:35:00.596474 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mf9jf" event={"ID":"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293","Type":"ContainerStarted","Data":"82fbcdb176100179de8fff043c4e2b92127e3e24b26ec9762d196e5f161a5cab"} Nov 27 11:35:01 crc kubenswrapper[4807]: I1127 11:35:01.605580 4807 generic.go:334] "Generic (PLEG): container finished" podID="b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" containerID="06c245efb859b62f76be64da5fa5aea7218254ea3e03c3ed736c014daf758c4c" exitCode=0 Nov 27 11:35:01 crc kubenswrapper[4807]: I1127 11:35:01.605660 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mf9jf" 
event={"ID":"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293","Type":"ContainerDied","Data":"06c245efb859b62f76be64da5fa5aea7218254ea3e03c3ed736c014daf758c4c"} Nov 27 11:35:01 crc kubenswrapper[4807]: I1127 11:35:01.608159 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 11:35:03 crc kubenswrapper[4807]: I1127 11:35:03.636898 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mf9jf" event={"ID":"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293","Type":"ContainerStarted","Data":"422a7a13781109c130c855e697ec76f1b186603d45eaec93d9255d3e4fbfe56b"} Nov 27 11:35:04 crc kubenswrapper[4807]: I1127 11:35:04.649196 4807 generic.go:334] "Generic (PLEG): container finished" podID="b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" containerID="422a7a13781109c130c855e697ec76f1b186603d45eaec93d9255d3e4fbfe56b" exitCode=0 Nov 27 11:35:04 crc kubenswrapper[4807]: I1127 11:35:04.649323 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mf9jf" event={"ID":"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293","Type":"ContainerDied","Data":"422a7a13781109c130c855e697ec76f1b186603d45eaec93d9255d3e4fbfe56b"} Nov 27 11:35:06 crc kubenswrapper[4807]: I1127 11:35:06.668045 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mf9jf" event={"ID":"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293","Type":"ContainerStarted","Data":"236abd3d35e7e59d2bd781fa231da5477f8e5531a3322e64e4af5912311e6349"} Nov 27 11:35:06 crc kubenswrapper[4807]: I1127 11:35:06.695939 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mf9jf" podStartSLOduration=3.826638754 podStartE2EDuration="7.695903268s" podCreationTimestamp="2025-11-27 11:34:59 +0000 UTC" firstStartedPulling="2025-11-27 11:35:01.607876565 +0000 UTC m=+1542.707374763" lastFinishedPulling="2025-11-27 11:35:05.477141059 +0000 UTC m=+1546.576639277" 
observedRunningTime="2025-11-27 11:35:06.685025303 +0000 UTC m=+1547.784523501" watchObservedRunningTime="2025-11-27 11:35:06.695903268 +0000 UTC m=+1547.795401516" Nov 27 11:35:10 crc kubenswrapper[4807]: I1127 11:35:10.059515 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:35:10 crc kubenswrapper[4807]: I1127 11:35:10.059810 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:35:11 crc kubenswrapper[4807]: I1127 11:35:11.118338 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mf9jf" podUID="b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" containerName="registry-server" probeResult="failure" output=< Nov 27 11:35:11 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Nov 27 11:35:11 crc kubenswrapper[4807]: > Nov 27 11:35:20 crc kubenswrapper[4807]: I1127 11:35:20.106626 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:35:20 crc kubenswrapper[4807]: I1127 11:35:20.156502 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:35:20 crc kubenswrapper[4807]: I1127 11:35:20.342540 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mf9jf"] Nov 27 11:35:21 crc kubenswrapper[4807]: I1127 11:35:21.809833 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mf9jf" podUID="b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" containerName="registry-server" containerID="cri-o://236abd3d35e7e59d2bd781fa231da5477f8e5531a3322e64e4af5912311e6349" gracePeriod=2 Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.235293 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.297153 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-utilities\") pod \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\" (UID: \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\") " Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.297295 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn2xd\" (UniqueName: \"kubernetes.io/projected/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-kube-api-access-xn2xd\") pod \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\" (UID: \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\") " Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.297403 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-catalog-content\") pod \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\" (UID: \"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293\") " Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.298042 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-utilities" (OuterVolumeSpecName: "utilities") pod "b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" (UID: "b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.303373 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-kube-api-access-xn2xd" (OuterVolumeSpecName: "kube-api-access-xn2xd") pod "b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" (UID: "b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293"). InnerVolumeSpecName "kube-api-access-xn2xd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.386384 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" (UID: "b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.399890 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.399923 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.399938 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn2xd\" (UniqueName: \"kubernetes.io/projected/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293-kube-api-access-xn2xd\") on node \"crc\" DevicePath \"\"" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.820020 4807 generic.go:334] "Generic (PLEG): container finished" podID="b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" containerID="236abd3d35e7e59d2bd781fa231da5477f8e5531a3322e64e4af5912311e6349" exitCode=0 Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.820070 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mf9jf" event={"ID":"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293","Type":"ContainerDied","Data":"236abd3d35e7e59d2bd781fa231da5477f8e5531a3322e64e4af5912311e6349"} Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.820091 4807 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mf9jf" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.820100 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mf9jf" event={"ID":"b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293","Type":"ContainerDied","Data":"82fbcdb176100179de8fff043c4e2b92127e3e24b26ec9762d196e5f161a5cab"} Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.820120 4807 scope.go:117] "RemoveContainer" containerID="236abd3d35e7e59d2bd781fa231da5477f8e5531a3322e64e4af5912311e6349" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.855690 4807 scope.go:117] "RemoveContainer" containerID="422a7a13781109c130c855e697ec76f1b186603d45eaec93d9255d3e4fbfe56b" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.867837 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mf9jf"] Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.879752 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mf9jf"] Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.880794 4807 scope.go:117] "RemoveContainer" containerID="06c245efb859b62f76be64da5fa5aea7218254ea3e03c3ed736c014daf758c4c" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.919058 4807 scope.go:117] "RemoveContainer" containerID="236abd3d35e7e59d2bd781fa231da5477f8e5531a3322e64e4af5912311e6349" Nov 27 11:35:22 crc kubenswrapper[4807]: E1127 11:35:22.919538 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"236abd3d35e7e59d2bd781fa231da5477f8e5531a3322e64e4af5912311e6349\": container with ID starting with 236abd3d35e7e59d2bd781fa231da5477f8e5531a3322e64e4af5912311e6349 not found: ID does not exist" containerID="236abd3d35e7e59d2bd781fa231da5477f8e5531a3322e64e4af5912311e6349" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.919575 4807 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236abd3d35e7e59d2bd781fa231da5477f8e5531a3322e64e4af5912311e6349"} err="failed to get container status \"236abd3d35e7e59d2bd781fa231da5477f8e5531a3322e64e4af5912311e6349\": rpc error: code = NotFound desc = could not find container \"236abd3d35e7e59d2bd781fa231da5477f8e5531a3322e64e4af5912311e6349\": container with ID starting with 236abd3d35e7e59d2bd781fa231da5477f8e5531a3322e64e4af5912311e6349 not found: ID does not exist" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.919601 4807 scope.go:117] "RemoveContainer" containerID="422a7a13781109c130c855e697ec76f1b186603d45eaec93d9255d3e4fbfe56b" Nov 27 11:35:22 crc kubenswrapper[4807]: E1127 11:35:22.919970 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422a7a13781109c130c855e697ec76f1b186603d45eaec93d9255d3e4fbfe56b\": container with ID starting with 422a7a13781109c130c855e697ec76f1b186603d45eaec93d9255d3e4fbfe56b not found: ID does not exist" containerID="422a7a13781109c130c855e697ec76f1b186603d45eaec93d9255d3e4fbfe56b" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.920003 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422a7a13781109c130c855e697ec76f1b186603d45eaec93d9255d3e4fbfe56b"} err="failed to get container status \"422a7a13781109c130c855e697ec76f1b186603d45eaec93d9255d3e4fbfe56b\": rpc error: code = NotFound desc = could not find container \"422a7a13781109c130c855e697ec76f1b186603d45eaec93d9255d3e4fbfe56b\": container with ID starting with 422a7a13781109c130c855e697ec76f1b186603d45eaec93d9255d3e4fbfe56b not found: ID does not exist" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.920030 4807 scope.go:117] "RemoveContainer" containerID="06c245efb859b62f76be64da5fa5aea7218254ea3e03c3ed736c014daf758c4c" Nov 27 11:35:22 crc kubenswrapper[4807]: E1127 
11:35:22.920235 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06c245efb859b62f76be64da5fa5aea7218254ea3e03c3ed736c014daf758c4c\": container with ID starting with 06c245efb859b62f76be64da5fa5aea7218254ea3e03c3ed736c014daf758c4c not found: ID does not exist" containerID="06c245efb859b62f76be64da5fa5aea7218254ea3e03c3ed736c014daf758c4c" Nov 27 11:35:22 crc kubenswrapper[4807]: I1127 11:35:22.920272 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c245efb859b62f76be64da5fa5aea7218254ea3e03c3ed736c014daf758c4c"} err="failed to get container status \"06c245efb859b62f76be64da5fa5aea7218254ea3e03c3ed736c014daf758c4c\": rpc error: code = NotFound desc = could not find container \"06c245efb859b62f76be64da5fa5aea7218254ea3e03c3ed736c014daf758c4c\": container with ID starting with 06c245efb859b62f76be64da5fa5aea7218254ea3e03c3ed736c014daf758c4c not found: ID does not exist" Nov 27 11:35:23 crc kubenswrapper[4807]: I1127 11:35:23.543589 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" path="/var/lib/kubelet/pods/b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293/volumes" Nov 27 11:35:23 crc kubenswrapper[4807]: I1127 11:35:23.911369 4807 scope.go:117] "RemoveContainer" containerID="e53072ecd3e796bc06b4588e2133c288edd5bec0d412c409df34c7db9aec7b44" Nov 27 11:35:23 crc kubenswrapper[4807]: I1127 11:35:23.933724 4807 scope.go:117] "RemoveContainer" containerID="20cec08b68acb488954f0cd2e2f3dc2aa597baeadf9aae28d7edda1d29d5c51a" Nov 27 11:35:31 crc kubenswrapper[4807]: I1127 11:35:31.915052 4807 generic.go:334] "Generic (PLEG): container finished" podID="7cfe00fa-307e-460b-a77e-a57439954c87" containerID="a658dfed16bb714c6cac2ff63947d595cca903c8cc8108421d36fe4542ff7c92" exitCode=0 Nov 27 11:35:31 crc kubenswrapper[4807]: I1127 11:35:31.915144 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" event={"ID":"7cfe00fa-307e-460b-a77e-a57439954c87","Type":"ContainerDied","Data":"a658dfed16bb714c6cac2ff63947d595cca903c8cc8108421d36fe4542ff7c92"}
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.569794 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zhvcp"]
Nov 27 11:35:32 crc kubenswrapper[4807]: E1127 11:35:32.570345 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" containerName="extract-utilities"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.570365 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" containerName="extract-utilities"
Nov 27 11:35:32 crc kubenswrapper[4807]: E1127 11:35:32.570388 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" containerName="registry-server"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.570399 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" containerName="registry-server"
Nov 27 11:35:32 crc kubenswrapper[4807]: E1127 11:35:32.570423 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" containerName="extract-content"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.570429 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" containerName="extract-content"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.570630 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f3d6bb-1c61-4a63-b0d5-1fe8a36a2293" containerName="registry-server"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.572055 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.586361 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zhvcp"]
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.601635 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftgz4\" (UniqueName: \"kubernetes.io/projected/f9c60a53-ed67-4268-b9b1-95df2f498f8a-kube-api-access-ftgz4\") pod \"community-operators-zhvcp\" (UID: \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\") " pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.601732 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c60a53-ed67-4268-b9b1-95df2f498f8a-utilities\") pod \"community-operators-zhvcp\" (UID: \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\") " pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.601864 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c60a53-ed67-4268-b9b1-95df2f498f8a-catalog-content\") pod \"community-operators-zhvcp\" (UID: \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\") " pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.704002 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c60a53-ed67-4268-b9b1-95df2f498f8a-catalog-content\") pod \"community-operators-zhvcp\" (UID: \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\") " pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.704140 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftgz4\" (UniqueName: \"kubernetes.io/projected/f9c60a53-ed67-4268-b9b1-95df2f498f8a-kube-api-access-ftgz4\") pod \"community-operators-zhvcp\" (UID: \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\") " pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.704174 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c60a53-ed67-4268-b9b1-95df2f498f8a-utilities\") pod \"community-operators-zhvcp\" (UID: \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\") " pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.704658 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c60a53-ed67-4268-b9b1-95df2f498f8a-utilities\") pod \"community-operators-zhvcp\" (UID: \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\") " pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.704829 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c60a53-ed67-4268-b9b1-95df2f498f8a-catalog-content\") pod \"community-operators-zhvcp\" (UID: \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\") " pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.725454 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftgz4\" (UniqueName: \"kubernetes.io/projected/f9c60a53-ed67-4268-b9b1-95df2f498f8a-kube-api-access-ftgz4\") pod \"community-operators-zhvcp\" (UID: \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\") " pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:32 crc kubenswrapper[4807]: I1127 11:35:32.891377 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:34 crc kubenswrapper[4807]: I1127 11:35:34.719648 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zhvcp"]
Nov 27 11:35:34 crc kubenswrapper[4807]: I1127 11:35:34.910727 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x"
Nov 27 11:35:34 crc kubenswrapper[4807]: I1127 11:35:34.944344 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zhvcp" event={"ID":"f9c60a53-ed67-4268-b9b1-95df2f498f8a","Type":"ContainerStarted","Data":"c8eb18a702d07021b6729f388ce9e588ee0c8f2d2115621b6c7a0eadde8c649f"}
Nov 27 11:35:34 crc kubenswrapper[4807]: I1127 11:35:34.945371 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x" event={"ID":"7cfe00fa-307e-460b-a77e-a57439954c87","Type":"ContainerDied","Data":"240e4a5a0b290411672ddc15eea2bcafb99723ee18381b08ab11208b11574fd3"}
Nov 27 11:35:34 crc kubenswrapper[4807]: I1127 11:35:34.945393 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="240e4a5a0b290411672ddc15eea2bcafb99723ee18381b08ab11208b11574fd3"
Nov 27 11:35:34 crc kubenswrapper[4807]: I1127 11:35:34.945477 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x"
Nov 27 11:35:34 crc kubenswrapper[4807]: I1127 11:35:34.981039 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7cfe00fa-307e-460b-a77e-a57439954c87-ssh-key\") pod \"7cfe00fa-307e-460b-a77e-a57439954c87\" (UID: \"7cfe00fa-307e-460b-a77e-a57439954c87\") "
Nov 27 11:35:34 crc kubenswrapper[4807]: I1127 11:35:34.981132 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djg7m\" (UniqueName: \"kubernetes.io/projected/7cfe00fa-307e-460b-a77e-a57439954c87-kube-api-access-djg7m\") pod \"7cfe00fa-307e-460b-a77e-a57439954c87\" (UID: \"7cfe00fa-307e-460b-a77e-a57439954c87\") "
Nov 27 11:35:34 crc kubenswrapper[4807]: I1127 11:35:34.981308 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cfe00fa-307e-460b-a77e-a57439954c87-inventory\") pod \"7cfe00fa-307e-460b-a77e-a57439954c87\" (UID: \"7cfe00fa-307e-460b-a77e-a57439954c87\") "
Nov 27 11:35:34 crc kubenswrapper[4807]: I1127 11:35:34.987924 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cfe00fa-307e-460b-a77e-a57439954c87-kube-api-access-djg7m" (OuterVolumeSpecName: "kube-api-access-djg7m") pod "7cfe00fa-307e-460b-a77e-a57439954c87" (UID: "7cfe00fa-307e-460b-a77e-a57439954c87"). InnerVolumeSpecName "kube-api-access-djg7m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 11:35:35 crc kubenswrapper[4807]: I1127 11:35:35.011531 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cfe00fa-307e-460b-a77e-a57439954c87-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7cfe00fa-307e-460b-a77e-a57439954c87" (UID: "7cfe00fa-307e-460b-a77e-a57439954c87"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 11:35:35 crc kubenswrapper[4807]: I1127 11:35:35.014152 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cfe00fa-307e-460b-a77e-a57439954c87-inventory" (OuterVolumeSpecName: "inventory") pod "7cfe00fa-307e-460b-a77e-a57439954c87" (UID: "7cfe00fa-307e-460b-a77e-a57439954c87"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 27 11:35:35 crc kubenswrapper[4807]: I1127 11:35:35.083645 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djg7m\" (UniqueName: \"kubernetes.io/projected/7cfe00fa-307e-460b-a77e-a57439954c87-kube-api-access-djg7m\") on node \"crc\" DevicePath \"\""
Nov 27 11:35:35 crc kubenswrapper[4807]: I1127 11:35:35.083688 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cfe00fa-307e-460b-a77e-a57439954c87-inventory\") on node \"crc\" DevicePath \"\""
Nov 27 11:35:35 crc kubenswrapper[4807]: I1127 11:35:35.083701 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7cfe00fa-307e-460b-a77e-a57439954c87-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 27 11:35:35 crc kubenswrapper[4807]: I1127 11:35:35.961313 4807 generic.go:334] "Generic (PLEG): container finished" podID="f9c60a53-ed67-4268-b9b1-95df2f498f8a" containerID="27b3e2821df085502f51b96bfbe6db069c16a1c0ef600de9b5c1e396edbed9a9" exitCode=0
Nov 27 11:35:35 crc kubenswrapper[4807]: I1127 11:35:35.961399 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zhvcp" event={"ID":"f9c60a53-ed67-4268-b9b1-95df2f498f8a","Type":"ContainerDied","Data":"27b3e2821df085502f51b96bfbe6db069c16a1c0ef600de9b5c1e396edbed9a9"}
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.027062 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"]
Nov 27 11:35:36 crc kubenswrapper[4807]: E1127 11:35:36.027522 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cfe00fa-307e-460b-a77e-a57439954c87" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.027542 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cfe00fa-307e-460b-a77e-a57439954c87" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.027722 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cfe00fa-307e-460b-a77e-a57439954c87" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.028364 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.033118 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.033118 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.033175 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.033648 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.039583 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"]
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.102568 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-64std\" (UID: \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.102620 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-64std\" (UID: \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.102786 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kr4v\" (UniqueName: \"kubernetes.io/projected/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-kube-api-access-5kr4v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-64std\" (UID: \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.204387 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-64std\" (UID: \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.204812 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kr4v\" (UniqueName: \"kubernetes.io/projected/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-kube-api-access-5kr4v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-64std\" (UID: \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.204885 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-64std\" (UID: \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.209783 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-64std\" (UID: \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.210331 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-64std\" (UID: \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.221329 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kr4v\" (UniqueName: \"kubernetes.io/projected/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-kube-api-access-5kr4v\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-64std\" (UID: \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.351336 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.695492 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std"]
Nov 27 11:35:36 crc kubenswrapper[4807]: W1127 11:35:36.701527 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod193a1dcb_8e1d_4c2b_be8c_92a94a5dfb9d.slice/crio-e00a4a00c997d4f46d17681981986d0c8f0786cba6e462b4b87cfb408ffc1c63 WatchSource:0}: Error finding container e00a4a00c997d4f46d17681981986d0c8f0786cba6e462b4b87cfb408ffc1c63: Status 404 returned error can't find the container with id e00a4a00c997d4f46d17681981986d0c8f0786cba6e462b4b87cfb408ffc1c63
Nov 27 11:35:36 crc kubenswrapper[4807]: I1127 11:35:36.970090 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std" event={"ID":"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d","Type":"ContainerStarted","Data":"e00a4a00c997d4f46d17681981986d0c8f0786cba6e462b4b87cfb408ffc1c63"}
Nov 27 11:35:37 crc kubenswrapper[4807]: I1127 11:35:37.981405 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zhvcp" event={"ID":"f9c60a53-ed67-4268-b9b1-95df2f498f8a","Type":"ContainerStarted","Data":"d774fa28d553c92e07b48c557153ba41b0870a1ca3e8a4a90756a7fb8baeecfb"}
Nov 27 11:35:37 crc kubenswrapper[4807]: I1127 11:35:37.984009 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std" event={"ID":"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d","Type":"ContainerStarted","Data":"5b807763da91765a817a348ef0c96993602ad334ff3e6881c0db7db8c8ffd6db"}
Nov 27 11:35:38 crc kubenswrapper[4807]: I1127 11:35:38.024559 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std" podStartSLOduration=2.544700689 podStartE2EDuration="3.024541335s" podCreationTimestamp="2025-11-27 11:35:35 +0000 UTC" firstStartedPulling="2025-11-27 11:35:36.703317638 +0000 UTC m=+1577.802815836" lastFinishedPulling="2025-11-27 11:35:37.183158284 +0000 UTC m=+1578.282656482" observedRunningTime="2025-11-27 11:35:38.024017421 +0000 UTC m=+1579.123515629" watchObservedRunningTime="2025-11-27 11:35:38.024541335 +0000 UTC m=+1579.124039533"
Nov 27 11:35:38 crc kubenswrapper[4807]: I1127 11:35:38.996505 4807 generic.go:334] "Generic (PLEG): container finished" podID="f9c60a53-ed67-4268-b9b1-95df2f498f8a" containerID="d774fa28d553c92e07b48c557153ba41b0870a1ca3e8a4a90756a7fb8baeecfb" exitCode=0
Nov 27 11:35:38 crc kubenswrapper[4807]: I1127 11:35:38.996569 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zhvcp" event={"ID":"f9c60a53-ed67-4268-b9b1-95df2f498f8a","Type":"ContainerDied","Data":"d774fa28d553c92e07b48c557153ba41b0870a1ca3e8a4a90756a7fb8baeecfb"}
Nov 27 11:35:40 crc kubenswrapper[4807]: I1127 11:35:40.008669 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zhvcp" event={"ID":"f9c60a53-ed67-4268-b9b1-95df2f498f8a","Type":"ContainerStarted","Data":"7f3a41214db66ec73906a36ef2b9ff29e3987aea74b785fea671f82d083f9a20"}
Nov 27 11:35:40 crc kubenswrapper[4807]: I1127 11:35:40.034926 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zhvcp" podStartSLOduration=4.392752072 podStartE2EDuration="8.034909949s" podCreationTimestamp="2025-11-27 11:35:32 +0000 UTC" firstStartedPulling="2025-11-27 11:35:35.964614371 +0000 UTC m=+1577.064112589" lastFinishedPulling="2025-11-27 11:35:39.606772278 +0000 UTC m=+1580.706270466" observedRunningTime="2025-11-27 11:35:40.029747734 +0000 UTC m=+1581.129245932" watchObservedRunningTime="2025-11-27 11:35:40.034909949 +0000 UTC m=+1581.134408147"
Nov 27 11:35:42 crc kubenswrapper[4807]: I1127 11:35:42.892919 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:42 crc kubenswrapper[4807]: I1127 11:35:42.893235 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:42 crc kubenswrapper[4807]: I1127 11:35:42.937548 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:47 crc kubenswrapper[4807]: I1127 11:35:47.042470 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-79e1-account-create-update-cjvzt"]
Nov 27 11:35:47 crc kubenswrapper[4807]: I1127 11:35:47.053084 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9fmgz"]
Nov 27 11:35:47 crc kubenswrapper[4807]: I1127 11:35:47.062484 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-79e1-account-create-update-cjvzt"]
Nov 27 11:35:47 crc kubenswrapper[4807]: I1127 11:35:47.070567 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9fmgz"]
Nov 27 11:35:47 crc kubenswrapper[4807]: I1127 11:35:47.555996 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0ca4c18-464b-4f06-8dee-dae2b044e9a5" path="/var/lib/kubelet/pods/c0ca4c18-464b-4f06-8dee-dae2b044e9a5/volumes"
Nov 27 11:35:47 crc kubenswrapper[4807]: I1127 11:35:47.557486 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d413ebfd-893d-437a-b034-72425fa40d8a" path="/var/lib/kubelet/pods/d413ebfd-893d-437a-b034-72425fa40d8a/volumes"
Nov 27 11:35:50 crc kubenswrapper[4807]: I1127 11:35:50.030018 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-j5bh8"]
Nov 27 11:35:50 crc kubenswrapper[4807]: I1127 11:35:50.045501 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-8abf-account-create-update-kvbpv"]
Nov 27 11:35:50 crc kubenswrapper[4807]: I1127 11:35:50.054043 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-j5bh8"]
Nov 27 11:35:50 crc kubenswrapper[4807]: I1127 11:35:50.062653 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-8abf-account-create-update-kvbpv"]
Nov 27 11:35:50 crc kubenswrapper[4807]: I1127 11:35:50.071858 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-fchf8"]
Nov 27 11:35:50 crc kubenswrapper[4807]: I1127 11:35:50.079643 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fchf8"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.036167 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-b10b-account-create-update-9ck9q"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.045279 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fb85-account-create-update-fsqpx"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.055004 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8534-account-create-update-q2bbw"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.065332 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-6vqr9"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.073662 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3b55-account-create-update-q96ln"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.082102 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-b10b-account-create-update-9ck9q"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.089188 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fb85-account-create-update-fsqpx"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.097145 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-6vqr9"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.104079 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3b55-account-create-update-q96ln"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.111259 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8534-account-create-update-q2bbw"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.118036 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-4qq8v"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.125145 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-c8cvq"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.131799 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-4qq8v"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.139007 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-c8cvq"]
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.544431 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00b9f61b-2731-4391-9be9-56437aa380e7" path="/var/lib/kubelet/pods/00b9f61b-2731-4391-9be9-56437aa380e7/volumes"
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.545463 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7" path="/var/lib/kubelet/pods/0338d3d8-1b48-4de4-9a8c-fb3f3dc3eef7/volumes"
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.546101 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df1ead0-bf09-4f2f-af65-52c27e8e750e" path="/var/lib/kubelet/pods/1df1ead0-bf09-4f2f-af65-52c27e8e750e/volumes"
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.546652 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a128226-2a88-4b86-8592-52eb97146227" path="/var/lib/kubelet/pods/4a128226-2a88-4b86-8592-52eb97146227/volumes"
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.547668 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6fdb88-9641-4e76-9418-7deae0ca555a" path="/var/lib/kubelet/pods/7e6fdb88-9641-4e76-9418-7deae0ca555a/volumes"
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.548158 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ab6613c-de00-44b8-9a65-cff272bb76e8" path="/var/lib/kubelet/pods/8ab6613c-de00-44b8-9a65-cff272bb76e8/volumes"
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.548706 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b61fd3b7-c996-4009-8e5a-208bba68066f" path="/var/lib/kubelet/pods/b61fd3b7-c996-4009-8e5a-208bba68066f/volumes"
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.549647 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86" path="/var/lib/kubelet/pods/cf612f8a-d4ca-44fa-bf1d-6ea476dd0c86/volumes"
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.550195 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d67782cf-87a3-44fe-88c8-16d66f224029" path="/var/lib/kubelet/pods/d67782cf-87a3-44fe-88c8-16d66f224029/volumes"
Nov 27 11:35:51 crc kubenswrapper[4807]: I1127 11:35:51.550722 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d2a7ab-579e-4690-8067-7ba5c08cf3c9" path="/var/lib/kubelet/pods/e6d2a7ab-579e-4690-8067-7ba5c08cf3c9/volumes"
Nov 27 11:35:52 crc kubenswrapper[4807]: I1127 11:35:52.943028 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:53 crc kubenswrapper[4807]: I1127 11:35:53.022786 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zhvcp"]
Nov 27 11:35:53 crc kubenswrapper[4807]: I1127 11:35:53.130437 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zhvcp" podUID="f9c60a53-ed67-4268-b9b1-95df2f498f8a" containerName="registry-server" containerID="cri-o://7f3a41214db66ec73906a36ef2b9ff29e3987aea74b785fea671f82d083f9a20" gracePeriod=2
Nov 27 11:35:53 crc kubenswrapper[4807]: I1127 11:35:53.584201 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:53 crc kubenswrapper[4807]: I1127 11:35:53.644536 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c60a53-ed67-4268-b9b1-95df2f498f8a-utilities\") pod \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\" (UID: \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\") "
Nov 27 11:35:53 crc kubenswrapper[4807]: I1127 11:35:53.644745 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c60a53-ed67-4268-b9b1-95df2f498f8a-catalog-content\") pod \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\" (UID: \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\") "
Nov 27 11:35:53 crc kubenswrapper[4807]: I1127 11:35:53.644812 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftgz4\" (UniqueName: \"kubernetes.io/projected/f9c60a53-ed67-4268-b9b1-95df2f498f8a-kube-api-access-ftgz4\") pod \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\" (UID: \"f9c60a53-ed67-4268-b9b1-95df2f498f8a\") "
Nov 27 11:35:53 crc kubenswrapper[4807]: I1127 11:35:53.645469 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c60a53-ed67-4268-b9b1-95df2f498f8a-utilities" (OuterVolumeSpecName: "utilities") pod "f9c60a53-ed67-4268-b9b1-95df2f498f8a" (UID: "f9c60a53-ed67-4268-b9b1-95df2f498f8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 11:35:53 crc kubenswrapper[4807]: I1127 11:35:53.646328 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9c60a53-ed67-4268-b9b1-95df2f498f8a-utilities\") on node \"crc\" DevicePath \"\""
Nov 27 11:35:53 crc kubenswrapper[4807]: I1127 11:35:53.649991 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c60a53-ed67-4268-b9b1-95df2f498f8a-kube-api-access-ftgz4" (OuterVolumeSpecName: "kube-api-access-ftgz4") pod "f9c60a53-ed67-4268-b9b1-95df2f498f8a" (UID: "f9c60a53-ed67-4268-b9b1-95df2f498f8a"). InnerVolumeSpecName "kube-api-access-ftgz4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 11:35:53 crc kubenswrapper[4807]: I1127 11:35:53.699564 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c60a53-ed67-4268-b9b1-95df2f498f8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9c60a53-ed67-4268-b9b1-95df2f498f8a" (UID: "f9c60a53-ed67-4268-b9b1-95df2f498f8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 11:35:53 crc kubenswrapper[4807]: I1127 11:35:53.747874 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9c60a53-ed67-4268-b9b1-95df2f498f8a-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 27 11:35:53 crc kubenswrapper[4807]: I1127 11:35:53.747906 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftgz4\" (UniqueName: \"kubernetes.io/projected/f9c60a53-ed67-4268-b9b1-95df2f498f8a-kube-api-access-ftgz4\") on node \"crc\" DevicePath \"\""
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.140542 4807 generic.go:334] "Generic (PLEG): container finished" podID="f9c60a53-ed67-4268-b9b1-95df2f498f8a" containerID="7f3a41214db66ec73906a36ef2b9ff29e3987aea74b785fea671f82d083f9a20" exitCode=0
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.140582 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zhvcp" event={"ID":"f9c60a53-ed67-4268-b9b1-95df2f498f8a","Type":"ContainerDied","Data":"7f3a41214db66ec73906a36ef2b9ff29e3987aea74b785fea671f82d083f9a20"}
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.140605 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zhvcp" event={"ID":"f9c60a53-ed67-4268-b9b1-95df2f498f8a","Type":"ContainerDied","Data":"c8eb18a702d07021b6729f388ce9e588ee0c8f2d2115621b6c7a0eadde8c649f"}
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.140621 4807 scope.go:117] "RemoveContainer" containerID="7f3a41214db66ec73906a36ef2b9ff29e3987aea74b785fea671f82d083f9a20"
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.140643 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zhvcp"
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.177856 4807 scope.go:117] "RemoveContainer" containerID="d774fa28d553c92e07b48c557153ba41b0870a1ca3e8a4a90756a7fb8baeecfb"
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.179392 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zhvcp"]
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.190350 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zhvcp"]
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.208306 4807 scope.go:117] "RemoveContainer" containerID="27b3e2821df085502f51b96bfbe6db069c16a1c0ef600de9b5c1e396edbed9a9"
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.254750 4807 scope.go:117] "RemoveContainer" containerID="7f3a41214db66ec73906a36ef2b9ff29e3987aea74b785fea671f82d083f9a20"
Nov 27 11:35:54 crc kubenswrapper[4807]: E1127 11:35:54.255291 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f3a41214db66ec73906a36ef2b9ff29e3987aea74b785fea671f82d083f9a20\": container with ID starting with 7f3a41214db66ec73906a36ef2b9ff29e3987aea74b785fea671f82d083f9a20 not found: ID does not exist" containerID="7f3a41214db66ec73906a36ef2b9ff29e3987aea74b785fea671f82d083f9a20"
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.255334 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f3a41214db66ec73906a36ef2b9ff29e3987aea74b785fea671f82d083f9a20"} err="failed to get container status \"7f3a41214db66ec73906a36ef2b9ff29e3987aea74b785fea671f82d083f9a20\": rpc error: code = NotFound desc = could not find container \"7f3a41214db66ec73906a36ef2b9ff29e3987aea74b785fea671f82d083f9a20\": container with ID starting with 7f3a41214db66ec73906a36ef2b9ff29e3987aea74b785fea671f82d083f9a20 not found: ID does not exist"
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.255375 4807 scope.go:117] "RemoveContainer" containerID="d774fa28d553c92e07b48c557153ba41b0870a1ca3e8a4a90756a7fb8baeecfb"
Nov 27 11:35:54 crc kubenswrapper[4807]: E1127 11:35:54.256800 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d774fa28d553c92e07b48c557153ba41b0870a1ca3e8a4a90756a7fb8baeecfb\": container with ID starting with d774fa28d553c92e07b48c557153ba41b0870a1ca3e8a4a90756a7fb8baeecfb not found: ID does not exist" containerID="d774fa28d553c92e07b48c557153ba41b0870a1ca3e8a4a90756a7fb8baeecfb"
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.256831 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d774fa28d553c92e07b48c557153ba41b0870a1ca3e8a4a90756a7fb8baeecfb"} err="failed to get container status \"d774fa28d553c92e07b48c557153ba41b0870a1ca3e8a4a90756a7fb8baeecfb\": rpc error: code = NotFound desc = could not find container \"d774fa28d553c92e07b48c557153ba41b0870a1ca3e8a4a90756a7fb8baeecfb\": container with ID starting with d774fa28d553c92e07b48c557153ba41b0870a1ca3e8a4a90756a7fb8baeecfb not found: ID does not exist"
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.256850 4807 scope.go:117] "RemoveContainer" containerID="27b3e2821df085502f51b96bfbe6db069c16a1c0ef600de9b5c1e396edbed9a9"
Nov 27 11:35:54 crc kubenswrapper[4807]: E1127 11:35:54.257171 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27b3e2821df085502f51b96bfbe6db069c16a1c0ef600de9b5c1e396edbed9a9\": container with ID starting with 27b3e2821df085502f51b96bfbe6db069c16a1c0ef600de9b5c1e396edbed9a9 not found: ID does not exist" containerID="27b3e2821df085502f51b96bfbe6db069c16a1c0ef600de9b5c1e396edbed9a9"
Nov 27 11:35:54 crc kubenswrapper[4807]: I1127 11:35:54.257204 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27b3e2821df085502f51b96bfbe6db069c16a1c0ef600de9b5c1e396edbed9a9"} err="failed to get container status \"27b3e2821df085502f51b96bfbe6db069c16a1c0ef600de9b5c1e396edbed9a9\": rpc error: code = NotFound desc = could not find container \"27b3e2821df085502f51b96bfbe6db069c16a1c0ef600de9b5c1e396edbed9a9\": container with ID starting with 27b3e2821df085502f51b96bfbe6db069c16a1c0ef600de9b5c1e396edbed9a9 not found: ID does not exist"
Nov 27 11:35:55 crc kubenswrapper[4807]: I1127 11:35:55.552618 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c60a53-ed67-4268-b9b1-95df2f498f8a" path="/var/lib/kubelet/pods/f9c60a53-ed67-4268-b9b1-95df2f498f8a/volumes"
Nov 27 11:36:15 crc kubenswrapper[4807]: I1127 11:36:15.934933 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2m2fm"]
Nov 27 11:36:15 crc kubenswrapper[4807]: E1127 11:36:15.936193 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c60a53-ed67-4268-b9b1-95df2f498f8a" containerName="extract-content"
Nov 27 11:36:15 crc kubenswrapper[4807]: I1127 11:36:15.936216 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c60a53-ed67-4268-b9b1-95df2f498f8a" containerName="extract-content"
Nov 27 11:36:15 crc kubenswrapper[4807]: E1127 11:36:15.936283 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c60a53-ed67-4268-b9b1-95df2f498f8a" containerName="extract-utilities"
Nov 27 11:36:15 crc kubenswrapper[4807]: I1127 11:36:15.936297 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c60a53-ed67-4268-b9b1-95df2f498f8a" containerName="extract-utilities"
Nov 27 11:36:15 crc kubenswrapper[4807]: E1127 11:36:15.936341 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c60a53-ed67-4268-b9b1-95df2f498f8a" containerName="registry-server"
Nov 27 11:36:15 crc kubenswrapper[4807]: I1127
11:36:15.936355 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c60a53-ed67-4268-b9b1-95df2f498f8a" containerName="registry-server" Nov 27 11:36:15 crc kubenswrapper[4807]: I1127 11:36:15.936711 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c60a53-ed67-4268-b9b1-95df2f498f8a" containerName="registry-server" Nov 27 11:36:15 crc kubenswrapper[4807]: I1127 11:36:15.939150 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:15 crc kubenswrapper[4807]: I1127 11:36:15.970384 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-catalog-content\") pod \"redhat-marketplace-2m2fm\" (UID: \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\") " pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:15 crc kubenswrapper[4807]: I1127 11:36:15.970615 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfdcr\" (UniqueName: \"kubernetes.io/projected/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-kube-api-access-kfdcr\") pod \"redhat-marketplace-2m2fm\" (UID: \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\") " pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:15 crc kubenswrapper[4807]: I1127 11:36:15.970645 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-utilities\") pod \"redhat-marketplace-2m2fm\" (UID: \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\") " pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:15 crc kubenswrapper[4807]: I1127 11:36:15.972133 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m2fm"] Nov 27 11:36:16 crc kubenswrapper[4807]: 
I1127 11:36:16.072413 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfdcr\" (UniqueName: \"kubernetes.io/projected/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-kube-api-access-kfdcr\") pod \"redhat-marketplace-2m2fm\" (UID: \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\") " pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:16 crc kubenswrapper[4807]: I1127 11:36:16.072472 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-utilities\") pod \"redhat-marketplace-2m2fm\" (UID: \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\") " pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:16 crc kubenswrapper[4807]: I1127 11:36:16.072543 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-catalog-content\") pod \"redhat-marketplace-2m2fm\" (UID: \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\") " pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:16 crc kubenswrapper[4807]: I1127 11:36:16.073117 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-catalog-content\") pod \"redhat-marketplace-2m2fm\" (UID: \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\") " pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:16 crc kubenswrapper[4807]: I1127 11:36:16.073501 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-utilities\") pod \"redhat-marketplace-2m2fm\" (UID: \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\") " pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:16 crc kubenswrapper[4807]: I1127 11:36:16.103604 4807 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfdcr\" (UniqueName: \"kubernetes.io/projected/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-kube-api-access-kfdcr\") pod \"redhat-marketplace-2m2fm\" (UID: \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\") " pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:16 crc kubenswrapper[4807]: I1127 11:36:16.266952 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:16 crc kubenswrapper[4807]: I1127 11:36:16.728991 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m2fm"] Nov 27 11:36:17 crc kubenswrapper[4807]: I1127 11:36:17.365768 4807 generic.go:334] "Generic (PLEG): container finished" podID="b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" containerID="dd9757e6446bf313938d7ec9b8a054170933ab0ce3683921cfd04d3a9f555696" exitCode=0 Nov 27 11:36:17 crc kubenswrapper[4807]: I1127 11:36:17.365870 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m2fm" event={"ID":"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2","Type":"ContainerDied","Data":"dd9757e6446bf313938d7ec9b8a054170933ab0ce3683921cfd04d3a9f555696"} Nov 27 11:36:17 crc kubenswrapper[4807]: I1127 11:36:17.366055 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m2fm" event={"ID":"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2","Type":"ContainerStarted","Data":"24b2957de99f6f253b213303bcdc7017afbf794f616e85dbd2b32eb6c2e9d78d"} Nov 27 11:36:18 crc kubenswrapper[4807]: I1127 11:36:18.045312 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-67wm7"] Nov 27 11:36:18 crc kubenswrapper[4807]: I1127 11:36:18.055339 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-xnw86"] Nov 27 11:36:18 crc kubenswrapper[4807]: I1127 11:36:18.065270 4807 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-xnw86"] Nov 27 11:36:18 crc kubenswrapper[4807]: I1127 11:36:18.073325 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-67wm7"] Nov 27 11:36:18 crc kubenswrapper[4807]: I1127 11:36:18.381459 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m2fm" event={"ID":"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2","Type":"ContainerStarted","Data":"75b9b82d0a7a9bb6941f41b6d5576a26aacf48b35000aa7e78d7cde195fda364"} Nov 27 11:36:19 crc kubenswrapper[4807]: I1127 11:36:19.391689 4807 generic.go:334] "Generic (PLEG): container finished" podID="b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" containerID="75b9b82d0a7a9bb6941f41b6d5576a26aacf48b35000aa7e78d7cde195fda364" exitCode=0 Nov 27 11:36:19 crc kubenswrapper[4807]: I1127 11:36:19.391739 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m2fm" event={"ID":"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2","Type":"ContainerDied","Data":"75b9b82d0a7a9bb6941f41b6d5576a26aacf48b35000aa7e78d7cde195fda364"} Nov 27 11:36:19 crc kubenswrapper[4807]: I1127 11:36:19.552519 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="123dc36c-92d1-4c98-9c02-ed2c4fbbff27" path="/var/lib/kubelet/pods/123dc36c-92d1-4c98-9c02-ed2c4fbbff27/volumes" Nov 27 11:36:19 crc kubenswrapper[4807]: I1127 11:36:19.553682 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca7e9a2b-6e64-4497-8003-8c9aaaf37806" path="/var/lib/kubelet/pods/ca7e9a2b-6e64-4497-8003-8c9aaaf37806/volumes" Nov 27 11:36:20 crc kubenswrapper[4807]: I1127 11:36:20.921535 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:36:20 
crc kubenswrapper[4807]: I1127 11:36:20.921889 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:36:21 crc kubenswrapper[4807]: I1127 11:36:21.410081 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m2fm" event={"ID":"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2","Type":"ContainerStarted","Data":"3617d03272de3414c59bf84aec68c54d8518e413ebed4330c6f49a680577631c"} Nov 27 11:36:21 crc kubenswrapper[4807]: I1127 11:36:21.431747 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2m2fm" podStartSLOduration=3.349897225 podStartE2EDuration="6.431730503s" podCreationTimestamp="2025-11-27 11:36:15 +0000 UTC" firstStartedPulling="2025-11-27 11:36:17.367152656 +0000 UTC m=+1618.466650854" lastFinishedPulling="2025-11-27 11:36:20.448985934 +0000 UTC m=+1621.548484132" observedRunningTime="2025-11-27 11:36:21.424498863 +0000 UTC m=+1622.523997061" watchObservedRunningTime="2025-11-27 11:36:21.431730503 +0000 UTC m=+1622.531228701" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.000548 4807 scope.go:117] "RemoveContainer" containerID="f964dd0eb744c72975313463cc7ca3d009eed12263d794eeeeae83172627b01a" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.022041 4807 scope.go:117] "RemoveContainer" containerID="bf9a3d8346831c0e092d078418588c9cb7623d129e78a924f34c5a64553c101a" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.066984 4807 scope.go:117] "RemoveContainer" containerID="c62dee7eb94cb18333febe98af1b8fa218551d8283dfd41d1d1dd1a2438e8c2b" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.113721 4807 scope.go:117] "RemoveContainer" 
containerID="76882f63bebb00e995acda55fe9e326376a9adaf2bb13424aafc6775bee1da48" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.185692 4807 scope.go:117] "RemoveContainer" containerID="0c33d586f8adc7e401025375dad3b043e4e8c7b55df0ceb3ec92cba470dba98f" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.211054 4807 scope.go:117] "RemoveContainer" containerID="c3cffb3485da8458f1f54cd6ab7161b732dfd9fa27f2432b58e43018cc6e472e" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.253355 4807 scope.go:117] "RemoveContainer" containerID="a0179e71faadcd429ddb3d2c5def194609426f5702bf4b4104c8b140249f676a" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.278129 4807 scope.go:117] "RemoveContainer" containerID="5e24e77cbbc21ed639c4dca77f9b0679b58e1f1a86059d1b2b33af534e10122e" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.299736 4807 scope.go:117] "RemoveContainer" containerID="fcd19fd8d50d0f57e1288a3a5bd8cff57feabd718395e7c3f92c2267c55b836a" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.324506 4807 scope.go:117] "RemoveContainer" containerID="ae61011a72922ed1bbce2fb5e97b93dfe335d75192538aecb2d89ddee6d09b84" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.343277 4807 scope.go:117] "RemoveContainer" containerID="f7d38e6347b03d6d3d12be71b78496063cdde1a25f864f1c82c2046c72edccca" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.360985 4807 scope.go:117] "RemoveContainer" containerID="a7027967779e3bb30645768cc67c2c1b522d99b74dad797fb4d4eca42d78889e" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.381276 4807 scope.go:117] "RemoveContainer" containerID="8483b45075dea8b3fed4078688781cfe06ebecd7ed015982417699173049577b" Nov 27 11:36:24 crc kubenswrapper[4807]: I1127 11:36:24.401705 4807 scope.go:117] "RemoveContainer" containerID="e9e442e566d906d46e1651604508bb54f67f9ec205754af36d294435cd4d572c" Nov 27 11:36:26 crc kubenswrapper[4807]: I1127 11:36:26.267220 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:26 crc kubenswrapper[4807]: I1127 11:36:26.267628 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:26 crc kubenswrapper[4807]: I1127 11:36:26.320829 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:26 crc kubenswrapper[4807]: I1127 11:36:26.531348 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:26 crc kubenswrapper[4807]: I1127 11:36:26.578629 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m2fm"] Nov 27 11:36:28 crc kubenswrapper[4807]: I1127 11:36:28.492438 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2m2fm" podUID="b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" containerName="registry-server" containerID="cri-o://3617d03272de3414c59bf84aec68c54d8518e413ebed4330c6f49a680577631c" gracePeriod=2 Nov 27 11:36:28 crc kubenswrapper[4807]: I1127 11:36:28.909182 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.019186 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-catalog-content\") pod \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\" (UID: \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\") " Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.019326 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-utilities\") pod \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\" (UID: \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\") " Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.019489 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfdcr\" (UniqueName: \"kubernetes.io/projected/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-kube-api-access-kfdcr\") pod \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\" (UID: \"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2\") " Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.020325 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-utilities" (OuterVolumeSpecName: "utilities") pod "b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" (UID: "b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.024995 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-kube-api-access-kfdcr" (OuterVolumeSpecName: "kube-api-access-kfdcr") pod "b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" (UID: "b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2"). InnerVolumeSpecName "kube-api-access-kfdcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.039836 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" (UID: "b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.121570 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.121804 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfdcr\" (UniqueName: \"kubernetes.io/projected/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-kube-api-access-kfdcr\") on node \"crc\" DevicePath \"\"" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.121817 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.503993 4807 generic.go:334] "Generic (PLEG): container finished" podID="b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" containerID="3617d03272de3414c59bf84aec68c54d8518e413ebed4330c6f49a680577631c" exitCode=0 Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.504039 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2m2fm" event={"ID":"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2","Type":"ContainerDied","Data":"3617d03272de3414c59bf84aec68c54d8518e413ebed4330c6f49a680577631c"} Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.504090 4807 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-2m2fm" event={"ID":"b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2","Type":"ContainerDied","Data":"24b2957de99f6f253b213303bcdc7017afbf794f616e85dbd2b32eb6c2e9d78d"} Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.504098 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2m2fm" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.504108 4807 scope.go:117] "RemoveContainer" containerID="3617d03272de3414c59bf84aec68c54d8518e413ebed4330c6f49a680577631c" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.530759 4807 scope.go:117] "RemoveContainer" containerID="75b9b82d0a7a9bb6941f41b6d5576a26aacf48b35000aa7e78d7cde195fda364" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.543370 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m2fm"] Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.553479 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2m2fm"] Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.564006 4807 scope.go:117] "RemoveContainer" containerID="dd9757e6446bf313938d7ec9b8a054170933ab0ce3683921cfd04d3a9f555696" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.595350 4807 scope.go:117] "RemoveContainer" containerID="3617d03272de3414c59bf84aec68c54d8518e413ebed4330c6f49a680577631c" Nov 27 11:36:29 crc kubenswrapper[4807]: E1127 11:36:29.595896 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3617d03272de3414c59bf84aec68c54d8518e413ebed4330c6f49a680577631c\": container with ID starting with 3617d03272de3414c59bf84aec68c54d8518e413ebed4330c6f49a680577631c not found: ID does not exist" containerID="3617d03272de3414c59bf84aec68c54d8518e413ebed4330c6f49a680577631c" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.595927 4807 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3617d03272de3414c59bf84aec68c54d8518e413ebed4330c6f49a680577631c"} err="failed to get container status \"3617d03272de3414c59bf84aec68c54d8518e413ebed4330c6f49a680577631c\": rpc error: code = NotFound desc = could not find container \"3617d03272de3414c59bf84aec68c54d8518e413ebed4330c6f49a680577631c\": container with ID starting with 3617d03272de3414c59bf84aec68c54d8518e413ebed4330c6f49a680577631c not found: ID does not exist" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.595950 4807 scope.go:117] "RemoveContainer" containerID="75b9b82d0a7a9bb6941f41b6d5576a26aacf48b35000aa7e78d7cde195fda364" Nov 27 11:36:29 crc kubenswrapper[4807]: E1127 11:36:29.596212 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b9b82d0a7a9bb6941f41b6d5576a26aacf48b35000aa7e78d7cde195fda364\": container with ID starting with 75b9b82d0a7a9bb6941f41b6d5576a26aacf48b35000aa7e78d7cde195fda364 not found: ID does not exist" containerID="75b9b82d0a7a9bb6941f41b6d5576a26aacf48b35000aa7e78d7cde195fda364" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.596234 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b9b82d0a7a9bb6941f41b6d5576a26aacf48b35000aa7e78d7cde195fda364"} err="failed to get container status \"75b9b82d0a7a9bb6941f41b6d5576a26aacf48b35000aa7e78d7cde195fda364\": rpc error: code = NotFound desc = could not find container \"75b9b82d0a7a9bb6941f41b6d5576a26aacf48b35000aa7e78d7cde195fda364\": container with ID starting with 75b9b82d0a7a9bb6941f41b6d5576a26aacf48b35000aa7e78d7cde195fda364 not found: ID does not exist" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.596264 4807 scope.go:117] "RemoveContainer" containerID="dd9757e6446bf313938d7ec9b8a054170933ab0ce3683921cfd04d3a9f555696" Nov 27 11:36:29 crc kubenswrapper[4807]: E1127 
11:36:29.596479 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd9757e6446bf313938d7ec9b8a054170933ab0ce3683921cfd04d3a9f555696\": container with ID starting with dd9757e6446bf313938d7ec9b8a054170933ab0ce3683921cfd04d3a9f555696 not found: ID does not exist" containerID="dd9757e6446bf313938d7ec9b8a054170933ab0ce3683921cfd04d3a9f555696" Nov 27 11:36:29 crc kubenswrapper[4807]: I1127 11:36:29.596510 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd9757e6446bf313938d7ec9b8a054170933ab0ce3683921cfd04d3a9f555696"} err="failed to get container status \"dd9757e6446bf313938d7ec9b8a054170933ab0ce3683921cfd04d3a9f555696\": rpc error: code = NotFound desc = could not find container \"dd9757e6446bf313938d7ec9b8a054170933ab0ce3683921cfd04d3a9f555696\": container with ID starting with dd9757e6446bf313938d7ec9b8a054170933ab0ce3683921cfd04d3a9f555696 not found: ID does not exist" Nov 27 11:36:31 crc kubenswrapper[4807]: I1127 11:36:31.542261 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" path="/var/lib/kubelet/pods/b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2/volumes" Nov 27 11:36:33 crc kubenswrapper[4807]: I1127 11:36:33.934001 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6rjsz"] Nov 27 11:36:33 crc kubenswrapper[4807]: E1127 11:36:33.934577 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" containerName="extract-content" Nov 27 11:36:33 crc kubenswrapper[4807]: I1127 11:36:33.934595 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" containerName="extract-content" Nov 27 11:36:33 crc kubenswrapper[4807]: E1127 11:36:33.934629 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" containerName="extract-utilities" Nov 27 11:36:33 crc kubenswrapper[4807]: I1127 11:36:33.934637 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" containerName="extract-utilities" Nov 27 11:36:33 crc kubenswrapper[4807]: E1127 11:36:33.934647 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" containerName="registry-server" Nov 27 11:36:33 crc kubenswrapper[4807]: I1127 11:36:33.934655 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" containerName="registry-server" Nov 27 11:36:33 crc kubenswrapper[4807]: I1127 11:36:33.934937 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4a70ec5-c4ff-405e-8004-a9b40dc7b0c2" containerName="registry-server" Nov 27 11:36:33 crc kubenswrapper[4807]: I1127 11:36:33.936442 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:33 crc kubenswrapper[4807]: I1127 11:36:33.948612 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6rjsz"] Nov 27 11:36:34 crc kubenswrapper[4807]: I1127 11:36:34.011396 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde0aeaf-ee02-416a-836b-016d16034194-utilities\") pod \"certified-operators-6rjsz\" (UID: \"bde0aeaf-ee02-416a-836b-016d16034194\") " pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:34 crc kubenswrapper[4807]: I1127 11:36:34.011530 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxwbq\" (UniqueName: \"kubernetes.io/projected/bde0aeaf-ee02-416a-836b-016d16034194-kube-api-access-vxwbq\") pod \"certified-operators-6rjsz\" (UID: 
\"bde0aeaf-ee02-416a-836b-016d16034194\") " pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:34 crc kubenswrapper[4807]: I1127 11:36:34.011577 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde0aeaf-ee02-416a-836b-016d16034194-catalog-content\") pod \"certified-operators-6rjsz\" (UID: \"bde0aeaf-ee02-416a-836b-016d16034194\") " pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:34 crc kubenswrapper[4807]: I1127 11:36:34.113488 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde0aeaf-ee02-416a-836b-016d16034194-utilities\") pod \"certified-operators-6rjsz\" (UID: \"bde0aeaf-ee02-416a-836b-016d16034194\") " pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:34 crc kubenswrapper[4807]: I1127 11:36:34.113633 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxwbq\" (UniqueName: \"kubernetes.io/projected/bde0aeaf-ee02-416a-836b-016d16034194-kube-api-access-vxwbq\") pod \"certified-operators-6rjsz\" (UID: \"bde0aeaf-ee02-416a-836b-016d16034194\") " pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:34 crc kubenswrapper[4807]: I1127 11:36:34.113664 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde0aeaf-ee02-416a-836b-016d16034194-catalog-content\") pod \"certified-operators-6rjsz\" (UID: \"bde0aeaf-ee02-416a-836b-016d16034194\") " pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:34 crc kubenswrapper[4807]: I1127 11:36:34.114131 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde0aeaf-ee02-416a-836b-016d16034194-catalog-content\") pod \"certified-operators-6rjsz\" (UID: 
\"bde0aeaf-ee02-416a-836b-016d16034194\") " pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:34 crc kubenswrapper[4807]: I1127 11:36:34.114130 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde0aeaf-ee02-416a-836b-016d16034194-utilities\") pod \"certified-operators-6rjsz\" (UID: \"bde0aeaf-ee02-416a-836b-016d16034194\") " pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:34 crc kubenswrapper[4807]: I1127 11:36:34.132862 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxwbq\" (UniqueName: \"kubernetes.io/projected/bde0aeaf-ee02-416a-836b-016d16034194-kube-api-access-vxwbq\") pod \"certified-operators-6rjsz\" (UID: \"bde0aeaf-ee02-416a-836b-016d16034194\") " pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:34 crc kubenswrapper[4807]: I1127 11:36:34.264034 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:34 crc kubenswrapper[4807]: I1127 11:36:34.763506 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6rjsz"] Nov 27 11:36:35 crc kubenswrapper[4807]: I1127 11:36:35.564769 4807 generic.go:334] "Generic (PLEG): container finished" podID="bde0aeaf-ee02-416a-836b-016d16034194" containerID="8252a9fefc62f0a80b95a13b303468be867e0e79e2e777103aa6462f5b4aa775" exitCode=0 Nov 27 11:36:35 crc kubenswrapper[4807]: I1127 11:36:35.564806 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rjsz" event={"ID":"bde0aeaf-ee02-416a-836b-016d16034194","Type":"ContainerDied","Data":"8252a9fefc62f0a80b95a13b303468be867e0e79e2e777103aa6462f5b4aa775"} Nov 27 11:36:35 crc kubenswrapper[4807]: I1127 11:36:35.565113 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rjsz" 
event={"ID":"bde0aeaf-ee02-416a-836b-016d16034194","Type":"ContainerStarted","Data":"3657e4fc7e3c9efaf95bf2c3f9c36a25a95a1ffc8e60bae2b4415a41e6a8f92b"} Nov 27 11:36:37 crc kubenswrapper[4807]: I1127 11:36:37.582428 4807 generic.go:334] "Generic (PLEG): container finished" podID="bde0aeaf-ee02-416a-836b-016d16034194" containerID="bea23a4c5bb4f885ef1f98a4dfdb106f12186385c87e3b6e0b4eebd94667c8d5" exitCode=0 Nov 27 11:36:37 crc kubenswrapper[4807]: I1127 11:36:37.582588 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rjsz" event={"ID":"bde0aeaf-ee02-416a-836b-016d16034194","Type":"ContainerDied","Data":"bea23a4c5bb4f885ef1f98a4dfdb106f12186385c87e3b6e0b4eebd94667c8d5"} Nov 27 11:36:38 crc kubenswrapper[4807]: I1127 11:36:38.613830 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rjsz" event={"ID":"bde0aeaf-ee02-416a-836b-016d16034194","Type":"ContainerStarted","Data":"bc532faf8f5ee3e34d77e2e14df07193f9daa26801ed6d0055a8406d0edc0f45"} Nov 27 11:36:38 crc kubenswrapper[4807]: I1127 11:36:38.645109 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6rjsz" podStartSLOduration=3.213531873 podStartE2EDuration="5.645088634s" podCreationTimestamp="2025-11-27 11:36:33 +0000 UTC" firstStartedPulling="2025-11-27 11:36:35.56611969 +0000 UTC m=+1636.665617888" lastFinishedPulling="2025-11-27 11:36:37.997676461 +0000 UTC m=+1639.097174649" observedRunningTime="2025-11-27 11:36:38.634720812 +0000 UTC m=+1639.734219010" watchObservedRunningTime="2025-11-27 11:36:38.645088634 +0000 UTC m=+1639.744586842" Nov 27 11:36:44 crc kubenswrapper[4807]: I1127 11:36:44.264227 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:44 crc kubenswrapper[4807]: I1127 11:36:44.264884 4807 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:44 crc kubenswrapper[4807]: I1127 11:36:44.315763 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:44 crc kubenswrapper[4807]: I1127 11:36:44.703004 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:44 crc kubenswrapper[4807]: I1127 11:36:44.750620 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6rjsz"] Nov 27 11:36:45 crc kubenswrapper[4807]: I1127 11:36:45.669834 4807 generic.go:334] "Generic (PLEG): container finished" podID="193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d" containerID="5b807763da91765a817a348ef0c96993602ad334ff3e6881c0db7db8c8ffd6db" exitCode=0 Nov 27 11:36:45 crc kubenswrapper[4807]: I1127 11:36:45.669951 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std" event={"ID":"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d","Type":"ContainerDied","Data":"5b807763da91765a817a348ef0c96993602ad334ff3e6881c0db7db8c8ffd6db"} Nov 27 11:36:46 crc kubenswrapper[4807]: I1127 11:36:46.678140 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6rjsz" podUID="bde0aeaf-ee02-416a-836b-016d16034194" containerName="registry-server" containerID="cri-o://bc532faf8f5ee3e34d77e2e14df07193f9daa26801ed6d0055a8406d0edc0f45" gracePeriod=2 Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.114340 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.120601 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.257489 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde0aeaf-ee02-416a-836b-016d16034194-catalog-content\") pod \"bde0aeaf-ee02-416a-836b-016d16034194\" (UID: \"bde0aeaf-ee02-416a-836b-016d16034194\") " Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.257613 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kr4v\" (UniqueName: \"kubernetes.io/projected/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-kube-api-access-5kr4v\") pod \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\" (UID: \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\") " Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.257660 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxwbq\" (UniqueName: \"kubernetes.io/projected/bde0aeaf-ee02-416a-836b-016d16034194-kube-api-access-vxwbq\") pod \"bde0aeaf-ee02-416a-836b-016d16034194\" (UID: \"bde0aeaf-ee02-416a-836b-016d16034194\") " Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.257797 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde0aeaf-ee02-416a-836b-016d16034194-utilities\") pod \"bde0aeaf-ee02-416a-836b-016d16034194\" (UID: \"bde0aeaf-ee02-416a-836b-016d16034194\") " Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.257832 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-ssh-key\") pod \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\" (UID: \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\") " Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.257946 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-inventory\") pod \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\" (UID: \"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d\") " Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.259134 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde0aeaf-ee02-416a-836b-016d16034194-utilities" (OuterVolumeSpecName: "utilities") pod "bde0aeaf-ee02-416a-836b-016d16034194" (UID: "bde0aeaf-ee02-416a-836b-016d16034194"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.263537 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-kube-api-access-5kr4v" (OuterVolumeSpecName: "kube-api-access-5kr4v") pod "193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d" (UID: "193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d"). InnerVolumeSpecName "kube-api-access-5kr4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.268272 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde0aeaf-ee02-416a-836b-016d16034194-kube-api-access-vxwbq" (OuterVolumeSpecName: "kube-api-access-vxwbq") pod "bde0aeaf-ee02-416a-836b-016d16034194" (UID: "bde0aeaf-ee02-416a-836b-016d16034194"). InnerVolumeSpecName "kube-api-access-vxwbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.288669 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-inventory" (OuterVolumeSpecName: "inventory") pod "193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d" (UID: "193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.290056 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d" (UID: "193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.306476 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bde0aeaf-ee02-416a-836b-016d16034194-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bde0aeaf-ee02-416a-836b-016d16034194" (UID: "bde0aeaf-ee02-416a-836b-016d16034194"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.360061 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.360094 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bde0aeaf-ee02-416a-836b-016d16034194-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.360107 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kr4v\" (UniqueName: \"kubernetes.io/projected/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-kube-api-access-5kr4v\") on node \"crc\" DevicePath \"\"" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.360116 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxwbq\" (UniqueName: \"kubernetes.io/projected/bde0aeaf-ee02-416a-836b-016d16034194-kube-api-access-vxwbq\") on node \"crc\" DevicePath 
\"\"" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.360124 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bde0aeaf-ee02-416a-836b-016d16034194-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.360132 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.688138 4807 generic.go:334] "Generic (PLEG): container finished" podID="bde0aeaf-ee02-416a-836b-016d16034194" containerID="bc532faf8f5ee3e34d77e2e14df07193f9daa26801ed6d0055a8406d0edc0f45" exitCode=0 Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.688197 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rjsz" event={"ID":"bde0aeaf-ee02-416a-836b-016d16034194","Type":"ContainerDied","Data":"bc532faf8f5ee3e34d77e2e14df07193f9daa26801ed6d0055a8406d0edc0f45"} Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.688279 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rjsz" event={"ID":"bde0aeaf-ee02-416a-836b-016d16034194","Type":"ContainerDied","Data":"3657e4fc7e3c9efaf95bf2c3f9c36a25a95a1ffc8e60bae2b4415a41e6a8f92b"} Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.688303 4807 scope.go:117] "RemoveContainer" containerID="bc532faf8f5ee3e34d77e2e14df07193f9daa26801ed6d0055a8406d0edc0f45" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.688354 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6rjsz" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.692631 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std" event={"ID":"193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d","Type":"ContainerDied","Data":"e00a4a00c997d4f46d17681981986d0c8f0786cba6e462b4b87cfb408ffc1c63"} Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.692659 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e00a4a00c997d4f46d17681981986d0c8f0786cba6e462b4b87cfb408ffc1c63" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.692702 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-64std" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.721652 4807 scope.go:117] "RemoveContainer" containerID="bea23a4c5bb4f885ef1f98a4dfdb106f12186385c87e3b6e0b4eebd94667c8d5" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.738083 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6rjsz"] Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.745649 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6rjsz"] Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.781721 4807 scope.go:117] "RemoveContainer" containerID="8252a9fefc62f0a80b95a13b303468be867e0e79e2e777103aa6462f5b4aa775" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.794378 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g"] Nov 27 11:36:47 crc kubenswrapper[4807]: E1127 11:36:47.794892 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde0aeaf-ee02-416a-836b-016d16034194" containerName="extract-content" Nov 27 11:36:47 crc 
kubenswrapper[4807]: I1127 11:36:47.794924 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde0aeaf-ee02-416a-836b-016d16034194" containerName="extract-content" Nov 27 11:36:47 crc kubenswrapper[4807]: E1127 11:36:47.794951 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde0aeaf-ee02-416a-836b-016d16034194" containerName="registry-server" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.794959 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde0aeaf-ee02-416a-836b-016d16034194" containerName="registry-server" Nov 27 11:36:47 crc kubenswrapper[4807]: E1127 11:36:47.794978 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde0aeaf-ee02-416a-836b-016d16034194" containerName="extract-utilities" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.794987 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde0aeaf-ee02-416a-836b-016d16034194" containerName="extract-utilities" Nov 27 11:36:47 crc kubenswrapper[4807]: E1127 11:36:47.795028 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.795038 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.795277 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.795323 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde0aeaf-ee02-416a-836b-016d16034194" containerName="registry-server" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.796666 4807 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.798965 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.799086 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.799118 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.800629 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.819410 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g"] Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.854396 4807 scope.go:117] "RemoveContainer" containerID="bc532faf8f5ee3e34d77e2e14df07193f9daa26801ed6d0055a8406d0edc0f45" Nov 27 11:36:47 crc kubenswrapper[4807]: E1127 11:36:47.854934 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc532faf8f5ee3e34d77e2e14df07193f9daa26801ed6d0055a8406d0edc0f45\": container with ID starting with bc532faf8f5ee3e34d77e2e14df07193f9daa26801ed6d0055a8406d0edc0f45 not found: ID does not exist" containerID="bc532faf8f5ee3e34d77e2e14df07193f9daa26801ed6d0055a8406d0edc0f45" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.854982 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc532faf8f5ee3e34d77e2e14df07193f9daa26801ed6d0055a8406d0edc0f45"} err="failed to get container status 
\"bc532faf8f5ee3e34d77e2e14df07193f9daa26801ed6d0055a8406d0edc0f45\": rpc error: code = NotFound desc = could not find container \"bc532faf8f5ee3e34d77e2e14df07193f9daa26801ed6d0055a8406d0edc0f45\": container with ID starting with bc532faf8f5ee3e34d77e2e14df07193f9daa26801ed6d0055a8406d0edc0f45 not found: ID does not exist" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.855009 4807 scope.go:117] "RemoveContainer" containerID="bea23a4c5bb4f885ef1f98a4dfdb106f12186385c87e3b6e0b4eebd94667c8d5" Nov 27 11:36:47 crc kubenswrapper[4807]: E1127 11:36:47.855489 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bea23a4c5bb4f885ef1f98a4dfdb106f12186385c87e3b6e0b4eebd94667c8d5\": container with ID starting with bea23a4c5bb4f885ef1f98a4dfdb106f12186385c87e3b6e0b4eebd94667c8d5 not found: ID does not exist" containerID="bea23a4c5bb4f885ef1f98a4dfdb106f12186385c87e3b6e0b4eebd94667c8d5" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.855540 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea23a4c5bb4f885ef1f98a4dfdb106f12186385c87e3b6e0b4eebd94667c8d5"} err="failed to get container status \"bea23a4c5bb4f885ef1f98a4dfdb106f12186385c87e3b6e0b4eebd94667c8d5\": rpc error: code = NotFound desc = could not find container \"bea23a4c5bb4f885ef1f98a4dfdb106f12186385c87e3b6e0b4eebd94667c8d5\": container with ID starting with bea23a4c5bb4f885ef1f98a4dfdb106f12186385c87e3b6e0b4eebd94667c8d5 not found: ID does not exist" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.855562 4807 scope.go:117] "RemoveContainer" containerID="8252a9fefc62f0a80b95a13b303468be867e0e79e2e777103aa6462f5b4aa775" Nov 27 11:36:47 crc kubenswrapper[4807]: E1127 11:36:47.855873 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8252a9fefc62f0a80b95a13b303468be867e0e79e2e777103aa6462f5b4aa775\": container with ID starting with 8252a9fefc62f0a80b95a13b303468be867e0e79e2e777103aa6462f5b4aa775 not found: ID does not exist" containerID="8252a9fefc62f0a80b95a13b303468be867e0e79e2e777103aa6462f5b4aa775" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.855904 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8252a9fefc62f0a80b95a13b303468be867e0e79e2e777103aa6462f5b4aa775"} err="failed to get container status \"8252a9fefc62f0a80b95a13b303468be867e0e79e2e777103aa6462f5b4aa775\": rpc error: code = NotFound desc = could not find container \"8252a9fefc62f0a80b95a13b303468be867e0e79e2e777103aa6462f5b4aa775\": container with ID starting with 8252a9fefc62f0a80b95a13b303468be867e0e79e2e777103aa6462f5b4aa775 not found: ID does not exist" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.875008 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e971f91a-7313-4149-af78-554da58f81e1-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lt95g\" (UID: \"e971f91a-7313-4149-af78-554da58f81e1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.875123 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e971f91a-7313-4149-af78-554da58f81e1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lt95g\" (UID: \"e971f91a-7313-4149-af78-554da58f81e1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.875502 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm6ck\" (UniqueName: 
\"kubernetes.io/projected/e971f91a-7313-4149-af78-554da58f81e1-kube-api-access-vm6ck\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lt95g\" (UID: \"e971f91a-7313-4149-af78-554da58f81e1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.977663 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e971f91a-7313-4149-af78-554da58f81e1-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lt95g\" (UID: \"e971f91a-7313-4149-af78-554da58f81e1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.978028 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e971f91a-7313-4149-af78-554da58f81e1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lt95g\" (UID: \"e971f91a-7313-4149-af78-554da58f81e1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.978191 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm6ck\" (UniqueName: \"kubernetes.io/projected/e971f91a-7313-4149-af78-554da58f81e1-kube-api-access-vm6ck\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lt95g\" (UID: \"e971f91a-7313-4149-af78-554da58f81e1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.981595 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e971f91a-7313-4149-af78-554da58f81e1-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lt95g\" (UID: \"e971f91a-7313-4149-af78-554da58f81e1\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.984792 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e971f91a-7313-4149-af78-554da58f81e1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lt95g\" (UID: \"e971f91a-7313-4149-af78-554da58f81e1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" Nov 27 11:36:47 crc kubenswrapper[4807]: I1127 11:36:47.997530 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm6ck\" (UniqueName: \"kubernetes.io/projected/e971f91a-7313-4149-af78-554da58f81e1-kube-api-access-vm6ck\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-lt95g\" (UID: \"e971f91a-7313-4149-af78-554da58f81e1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" Nov 27 11:36:48 crc kubenswrapper[4807]: I1127 11:36:48.194320 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" Nov 27 11:36:48 crc kubenswrapper[4807]: I1127 11:36:48.716077 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g"] Nov 27 11:36:49 crc kubenswrapper[4807]: I1127 11:36:49.548319 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde0aeaf-ee02-416a-836b-016d16034194" path="/var/lib/kubelet/pods/bde0aeaf-ee02-416a-836b-016d16034194/volumes" Nov 27 11:36:49 crc kubenswrapper[4807]: I1127 11:36:49.719446 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" event={"ID":"e971f91a-7313-4149-af78-554da58f81e1","Type":"ContainerStarted","Data":"e9cae5154c1c733acd8d8a93aba38e59da645d2e9485905e5ce4971511a66362"} Nov 27 11:36:50 crc kubenswrapper[4807]: I1127 11:36:50.035231 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-5w7sj"] Nov 27 11:36:50 crc kubenswrapper[4807]: I1127 11:36:50.043085 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-5w7sj"] Nov 27 11:36:50 crc kubenswrapper[4807]: I1127 11:36:50.728649 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" event={"ID":"e971f91a-7313-4149-af78-554da58f81e1","Type":"ContainerStarted","Data":"bb3d3ea895ba5fbc3787eeb3f49e9fa359c2a9462536ce7b48a68a9e0513432e"} Nov 27 11:36:50 crc kubenswrapper[4807]: I1127 11:36:50.751507 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" podStartSLOduration=2.807101303 podStartE2EDuration="3.751490195s" podCreationTimestamp="2025-11-27 11:36:47 +0000 UTC" firstStartedPulling="2025-11-27 11:36:48.718874208 +0000 UTC m=+1649.818372406" lastFinishedPulling="2025-11-27 
11:36:49.6632631 +0000 UTC m=+1650.762761298" observedRunningTime="2025-11-27 11:36:50.745126888 +0000 UTC m=+1651.844625096" watchObservedRunningTime="2025-11-27 11:36:50.751490195 +0000 UTC m=+1651.850988393" Nov 27 11:36:50 crc kubenswrapper[4807]: I1127 11:36:50.921371 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:36:50 crc kubenswrapper[4807]: I1127 11:36:50.921428 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:36:51 crc kubenswrapper[4807]: I1127 11:36:51.553039 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda15506-8b54-4097-ba8e-55dabab3ede7" path="/var/lib/kubelet/pods/fda15506-8b54-4097-ba8e-55dabab3ede7/volumes" Nov 27 11:36:54 crc kubenswrapper[4807]: I1127 11:36:54.765043 4807 generic.go:334] "Generic (PLEG): container finished" podID="e971f91a-7313-4149-af78-554da58f81e1" containerID="bb3d3ea895ba5fbc3787eeb3f49e9fa359c2a9462536ce7b48a68a9e0513432e" exitCode=0 Nov 27 11:36:54 crc kubenswrapper[4807]: I1127 11:36:54.765571 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" event={"ID":"e971f91a-7313-4149-af78-554da58f81e1","Type":"ContainerDied","Data":"bb3d3ea895ba5fbc3787eeb3f49e9fa359c2a9462536ce7b48a68a9e0513432e"} Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.082076 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-sgwqm"] Nov 27 11:36:56 crc 
kubenswrapper[4807]: I1127 11:36:56.090662 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-sgwqm"] Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.194472 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.358427 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm6ck\" (UniqueName: \"kubernetes.io/projected/e971f91a-7313-4149-af78-554da58f81e1-kube-api-access-vm6ck\") pod \"e971f91a-7313-4149-af78-554da58f81e1\" (UID: \"e971f91a-7313-4149-af78-554da58f81e1\") " Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.358809 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e971f91a-7313-4149-af78-554da58f81e1-ssh-key\") pod \"e971f91a-7313-4149-af78-554da58f81e1\" (UID: \"e971f91a-7313-4149-af78-554da58f81e1\") " Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.358920 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e971f91a-7313-4149-af78-554da58f81e1-inventory\") pod \"e971f91a-7313-4149-af78-554da58f81e1\" (UID: \"e971f91a-7313-4149-af78-554da58f81e1\") " Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.364441 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e971f91a-7313-4149-af78-554da58f81e1-kube-api-access-vm6ck" (OuterVolumeSpecName: "kube-api-access-vm6ck") pod "e971f91a-7313-4149-af78-554da58f81e1" (UID: "e971f91a-7313-4149-af78-554da58f81e1"). InnerVolumeSpecName "kube-api-access-vm6ck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.385094 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e971f91a-7313-4149-af78-554da58f81e1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e971f91a-7313-4149-af78-554da58f81e1" (UID: "e971f91a-7313-4149-af78-554da58f81e1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.407896 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e971f91a-7313-4149-af78-554da58f81e1-inventory" (OuterVolumeSpecName: "inventory") pod "e971f91a-7313-4149-af78-554da58f81e1" (UID: "e971f91a-7313-4149-af78-554da58f81e1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.461757 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm6ck\" (UniqueName: \"kubernetes.io/projected/e971f91a-7313-4149-af78-554da58f81e1-kube-api-access-vm6ck\") on node \"crc\" DevicePath \"\"" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.461820 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e971f91a-7313-4149-af78-554da58f81e1-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.461834 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e971f91a-7313-4149-af78-554da58f81e1-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.781715 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" 
event={"ID":"e971f91a-7313-4149-af78-554da58f81e1","Type":"ContainerDied","Data":"e9cae5154c1c733acd8d8a93aba38e59da645d2e9485905e5ce4971511a66362"} Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.781955 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9cae5154c1c733acd8d8a93aba38e59da645d2e9485905e5ce4971511a66362" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.781805 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-lt95g" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.867026 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6"] Nov 27 11:36:56 crc kubenswrapper[4807]: E1127 11:36:56.867473 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e971f91a-7313-4149-af78-554da58f81e1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.867494 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e971f91a-7313-4149-af78-554da58f81e1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.867716 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="e971f91a-7313-4149-af78-554da58f81e1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.868440 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.880296 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.880418 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.880632 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.880704 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.882984 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6"] Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.970686 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0775b68-e606-412d-a9b9-1f8eb98bbd63-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xc7q6\" (UID: \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.970796 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5m9q\" (UniqueName: \"kubernetes.io/projected/b0775b68-e606-412d-a9b9-1f8eb98bbd63-kube-api-access-k5m9q\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xc7q6\" (UID: \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" Nov 27 11:36:56 crc kubenswrapper[4807]: I1127 11:36:56.970846 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0775b68-e606-412d-a9b9-1f8eb98bbd63-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xc7q6\" (UID: \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" Nov 27 11:36:57 crc kubenswrapper[4807]: I1127 11:36:57.072807 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0775b68-e606-412d-a9b9-1f8eb98bbd63-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xc7q6\" (UID: \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" Nov 27 11:36:57 crc kubenswrapper[4807]: I1127 11:36:57.072884 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5m9q\" (UniqueName: \"kubernetes.io/projected/b0775b68-e606-412d-a9b9-1f8eb98bbd63-kube-api-access-k5m9q\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xc7q6\" (UID: \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" Nov 27 11:36:57 crc kubenswrapper[4807]: I1127 11:36:57.072942 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0775b68-e606-412d-a9b9-1f8eb98bbd63-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xc7q6\" (UID: \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" Nov 27 11:36:57 crc kubenswrapper[4807]: I1127 11:36:57.077134 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0775b68-e606-412d-a9b9-1f8eb98bbd63-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xc7q6\" (UID: 
\"b0775b68-e606-412d-a9b9-1f8eb98bbd63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" Nov 27 11:36:57 crc kubenswrapper[4807]: I1127 11:36:57.077240 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0775b68-e606-412d-a9b9-1f8eb98bbd63-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xc7q6\" (UID: \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" Nov 27 11:36:57 crc kubenswrapper[4807]: I1127 11:36:57.092656 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5m9q\" (UniqueName: \"kubernetes.io/projected/b0775b68-e606-412d-a9b9-1f8eb98bbd63-kube-api-access-k5m9q\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-xc7q6\" (UID: \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" Nov 27 11:36:57 crc kubenswrapper[4807]: I1127 11:36:57.203410 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" Nov 27 11:36:57 crc kubenswrapper[4807]: I1127 11:36:57.553057 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f65bbc0-75a0-4294-9cf8-0023799a1fea" path="/var/lib/kubelet/pods/0f65bbc0-75a0-4294-9cf8-0023799a1fea/volumes" Nov 27 11:36:57 crc kubenswrapper[4807]: I1127 11:36:57.687145 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6"] Nov 27 11:36:57 crc kubenswrapper[4807]: I1127 11:36:57.791612 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" event={"ID":"b0775b68-e606-412d-a9b9-1f8eb98bbd63","Type":"ContainerStarted","Data":"009d7c11567208cfe31278ebbb1b378875a38bf19ce5c75393f9174932949545"} Nov 27 11:36:58 crc kubenswrapper[4807]: I1127 11:36:58.800743 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" event={"ID":"b0775b68-e606-412d-a9b9-1f8eb98bbd63","Type":"ContainerStarted","Data":"5c05461a8b9a9f6c2af770f04bdb7769d3db4f248814910b327f000fd2be99ee"} Nov 27 11:37:01 crc kubenswrapper[4807]: I1127 11:37:01.027423 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" podStartSLOduration=4.393218757 podStartE2EDuration="5.027402482s" podCreationTimestamp="2025-11-27 11:36:56 +0000 UTC" firstStartedPulling="2025-11-27 11:36:57.700573336 +0000 UTC m=+1658.800071534" lastFinishedPulling="2025-11-27 11:36:58.334757061 +0000 UTC m=+1659.434255259" observedRunningTime="2025-11-27 11:36:58.818785768 +0000 UTC m=+1659.918283966" watchObservedRunningTime="2025-11-27 11:37:01.027402482 +0000 UTC m=+1662.126900680" Nov 27 11:37:01 crc kubenswrapper[4807]: I1127 11:37:01.030934 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-bootstrap-lqqgw"] Nov 27 11:37:01 crc kubenswrapper[4807]: I1127 11:37:01.040466 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lqqgw"] Nov 27 11:37:01 crc kubenswrapper[4807]: I1127 11:37:01.544045 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e03b856-d6ea-40a5-96db-10d788131661" path="/var/lib/kubelet/pods/3e03b856-d6ea-40a5-96db-10d788131661/volumes" Nov 27 11:37:14 crc kubenswrapper[4807]: I1127 11:37:14.031593 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-rbw47"] Nov 27 11:37:14 crc kubenswrapper[4807]: I1127 11:37:14.041918 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-whdsr"] Nov 27 11:37:14 crc kubenswrapper[4807]: I1127 11:37:14.051563 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-rbw47"] Nov 27 11:37:14 crc kubenswrapper[4807]: I1127 11:37:14.059831 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-whdsr"] Nov 27 11:37:15 crc kubenswrapper[4807]: I1127 11:37:15.543559 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b745997-2256-496c-acee-f804c263ec35" path="/var/lib/kubelet/pods/1b745997-2256-496c-acee-f804c263ec35/volumes" Nov 27 11:37:15 crc kubenswrapper[4807]: I1127 11:37:15.544479 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2df3b54-f71f-469f-92e5-8c1daeb90a45" path="/var/lib/kubelet/pods/b2df3b54-f71f-469f-92e5-8c1daeb90a45/volumes" Nov 27 11:37:20 crc kubenswrapper[4807]: I1127 11:37:20.921348 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:37:20 crc kubenswrapper[4807]: I1127 11:37:20.921738 
4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:37:20 crc kubenswrapper[4807]: I1127 11:37:20.921784 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:37:20 crc kubenswrapper[4807]: I1127 11:37:20.922715 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 11:37:20 crc kubenswrapper[4807]: I1127 11:37:20.922805 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" gracePeriod=600 Nov 27 11:37:21 crc kubenswrapper[4807]: E1127 11:37:21.053065 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:37:21 crc kubenswrapper[4807]: I1127 11:37:21.999521 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" exitCode=0 Nov 27 11:37:21 crc kubenswrapper[4807]: I1127 11:37:21.999571 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88"} Nov 27 11:37:22 crc kubenswrapper[4807]: I1127 11:37:21.999611 4807 scope.go:117] "RemoveContainer" containerID="c29d8c1f4aa1a4d993b85b54355b8275690d350f55ec0d06dbc91b3680f8a870" Nov 27 11:37:22 crc kubenswrapper[4807]: I1127 11:37:22.000223 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:37:22 crc kubenswrapper[4807]: E1127 11:37:22.000602 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:37:24 crc kubenswrapper[4807]: I1127 11:37:24.660488 4807 scope.go:117] "RemoveContainer" containerID="ddd100747950730a02526bbee588199cf983baa95869b5cbdcc83ac734a1ede8" Nov 27 11:37:24 crc kubenswrapper[4807]: I1127 11:37:24.693939 4807 scope.go:117] "RemoveContainer" containerID="94fbd060b6de61963187fbe3a10974c6b2c95045586e88ea94cef730b8529b67" Nov 27 11:37:24 crc kubenswrapper[4807]: I1127 11:37:24.757549 4807 scope.go:117] "RemoveContainer" containerID="372c20533929fa3e737438722668f2ec05f738c37da43875aa72d9d3ec74a23b" Nov 27 11:37:24 crc kubenswrapper[4807]: I1127 11:37:24.815294 4807 scope.go:117] "RemoveContainer" 
containerID="55e04a6c44d7553589965d4f48795e1b0b6153736bf978aec0361b403c94dee2" Nov 27 11:37:24 crc kubenswrapper[4807]: I1127 11:37:24.843824 4807 scope.go:117] "RemoveContainer" containerID="b2312dad39bfa6284f68ec8d4106417fddde6821195f09606a51db2d99c9906b" Nov 27 11:37:33 crc kubenswrapper[4807]: I1127 11:37:33.100104 4807 generic.go:334] "Generic (PLEG): container finished" podID="b0775b68-e606-412d-a9b9-1f8eb98bbd63" containerID="5c05461a8b9a9f6c2af770f04bdb7769d3db4f248814910b327f000fd2be99ee" exitCode=0 Nov 27 11:37:33 crc kubenswrapper[4807]: I1127 11:37:33.100229 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" event={"ID":"b0775b68-e606-412d-a9b9-1f8eb98bbd63","Type":"ContainerDied","Data":"5c05461a8b9a9f6c2af770f04bdb7769d3db4f248814910b327f000fd2be99ee"} Nov 27 11:37:34 crc kubenswrapper[4807]: I1127 11:37:34.532560 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:37:34 crc kubenswrapper[4807]: E1127 11:37:34.533195 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:37:34 crc kubenswrapper[4807]: I1127 11:37:34.537722 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" Nov 27 11:37:34 crc kubenswrapper[4807]: I1127 11:37:34.620088 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0775b68-e606-412d-a9b9-1f8eb98bbd63-ssh-key\") pod \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\" (UID: \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\") " Nov 27 11:37:34 crc kubenswrapper[4807]: I1127 11:37:34.620179 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5m9q\" (UniqueName: \"kubernetes.io/projected/b0775b68-e606-412d-a9b9-1f8eb98bbd63-kube-api-access-k5m9q\") pod \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\" (UID: \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\") " Nov 27 11:37:34 crc kubenswrapper[4807]: I1127 11:37:34.620360 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0775b68-e606-412d-a9b9-1f8eb98bbd63-inventory\") pod \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\" (UID: \"b0775b68-e606-412d-a9b9-1f8eb98bbd63\") " Nov 27 11:37:34 crc kubenswrapper[4807]: I1127 11:37:34.625690 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0775b68-e606-412d-a9b9-1f8eb98bbd63-kube-api-access-k5m9q" (OuterVolumeSpecName: "kube-api-access-k5m9q") pod "b0775b68-e606-412d-a9b9-1f8eb98bbd63" (UID: "b0775b68-e606-412d-a9b9-1f8eb98bbd63"). InnerVolumeSpecName "kube-api-access-k5m9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:37:34 crc kubenswrapper[4807]: I1127 11:37:34.647762 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0775b68-e606-412d-a9b9-1f8eb98bbd63-inventory" (OuterVolumeSpecName: "inventory") pod "b0775b68-e606-412d-a9b9-1f8eb98bbd63" (UID: "b0775b68-e606-412d-a9b9-1f8eb98bbd63"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:37:34 crc kubenswrapper[4807]: I1127 11:37:34.653465 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0775b68-e606-412d-a9b9-1f8eb98bbd63-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b0775b68-e606-412d-a9b9-1f8eb98bbd63" (UID: "b0775b68-e606-412d-a9b9-1f8eb98bbd63"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:37:34 crc kubenswrapper[4807]: I1127 11:37:34.722236 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0775b68-e606-412d-a9b9-1f8eb98bbd63-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:37:34 crc kubenswrapper[4807]: I1127 11:37:34.722286 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5m9q\" (UniqueName: \"kubernetes.io/projected/b0775b68-e606-412d-a9b9-1f8eb98bbd63-kube-api-access-k5m9q\") on node \"crc\" DevicePath \"\"" Nov 27 11:37:34 crc kubenswrapper[4807]: I1127 11:37:34.722298 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0775b68-e606-412d-a9b9-1f8eb98bbd63-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.121746 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" event={"ID":"b0775b68-e606-412d-a9b9-1f8eb98bbd63","Type":"ContainerDied","Data":"009d7c11567208cfe31278ebbb1b378875a38bf19ce5c75393f9174932949545"} Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.122165 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="009d7c11567208cfe31278ebbb1b378875a38bf19ce5c75393f9174932949545" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.121809 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-xc7q6" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.202721 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw"] Nov 27 11:37:35 crc kubenswrapper[4807]: E1127 11:37:35.203095 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0775b68-e606-412d-a9b9-1f8eb98bbd63" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.203114 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0775b68-e606-412d-a9b9-1f8eb98bbd63" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.203307 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0775b68-e606-412d-a9b9-1f8eb98bbd63" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.203871 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.205809 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.206090 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.206288 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.206471 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.247481 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw"] Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.336684 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/369043db-4f00-4bbd-ab16-6d8f27564af2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw\" (UID: \"369043db-4f00-4bbd-ab16-6d8f27564af2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.336734 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c98z\" (UniqueName: \"kubernetes.io/projected/369043db-4f00-4bbd-ab16-6d8f27564af2-kube-api-access-4c98z\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw\" (UID: \"369043db-4f00-4bbd-ab16-6d8f27564af2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.336901 4807 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/369043db-4f00-4bbd-ab16-6d8f27564af2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw\" (UID: \"369043db-4f00-4bbd-ab16-6d8f27564af2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.438837 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/369043db-4f00-4bbd-ab16-6d8f27564af2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw\" (UID: \"369043db-4f00-4bbd-ab16-6d8f27564af2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.438879 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c98z\" (UniqueName: \"kubernetes.io/projected/369043db-4f00-4bbd-ab16-6d8f27564af2-kube-api-access-4c98z\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw\" (UID: \"369043db-4f00-4bbd-ab16-6d8f27564af2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.438929 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/369043db-4f00-4bbd-ab16-6d8f27564af2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw\" (UID: \"369043db-4f00-4bbd-ab16-6d8f27564af2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.443969 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/369043db-4f00-4bbd-ab16-6d8f27564af2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw\" (UID: 
\"369043db-4f00-4bbd-ab16-6d8f27564af2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.450439 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/369043db-4f00-4bbd-ab16-6d8f27564af2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw\" (UID: \"369043db-4f00-4bbd-ab16-6d8f27564af2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.472627 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c98z\" (UniqueName: \"kubernetes.io/projected/369043db-4f00-4bbd-ab16-6d8f27564af2-kube-api-access-4c98z\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw\" (UID: \"369043db-4f00-4bbd-ab16-6d8f27564af2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" Nov 27 11:37:35 crc kubenswrapper[4807]: I1127 11:37:35.533969 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" Nov 27 11:37:36 crc kubenswrapper[4807]: I1127 11:37:36.011748 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw"] Nov 27 11:37:36 crc kubenswrapper[4807]: W1127 11:37:36.015355 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod369043db_4f00_4bbd_ab16_6d8f27564af2.slice/crio-e886c5ab15e466310f4b2bfcefac129438f5cbd4ffab38d67932a7d9d114a7fb WatchSource:0}: Error finding container e886c5ab15e466310f4b2bfcefac129438f5cbd4ffab38d67932a7d9d114a7fb: Status 404 returned error can't find the container with id e886c5ab15e466310f4b2bfcefac129438f5cbd4ffab38d67932a7d9d114a7fb Nov 27 11:37:36 crc kubenswrapper[4807]: I1127 11:37:36.129605 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" event={"ID":"369043db-4f00-4bbd-ab16-6d8f27564af2","Type":"ContainerStarted","Data":"e886c5ab15e466310f4b2bfcefac129438f5cbd4ffab38d67932a7d9d114a7fb"} Nov 27 11:37:37 crc kubenswrapper[4807]: I1127 11:37:37.151537 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" event={"ID":"369043db-4f00-4bbd-ab16-6d8f27564af2","Type":"ContainerStarted","Data":"953d55f2d41f1847c19ab20065445355f726fd989f52abb294049a0fd87b9645"} Nov 27 11:37:37 crc kubenswrapper[4807]: I1127 11:37:37.172742 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" podStartSLOduration=1.557870066 podStartE2EDuration="2.172724465s" podCreationTimestamp="2025-11-27 11:37:35 +0000 UTC" firstStartedPulling="2025-11-27 11:37:36.018095877 +0000 UTC m=+1697.117594075" lastFinishedPulling="2025-11-27 11:37:36.632950256 +0000 UTC m=+1697.732448474" 
observedRunningTime="2025-11-27 11:37:37.168955486 +0000 UTC m=+1698.268453704" watchObservedRunningTime="2025-11-27 11:37:37.172724465 +0000 UTC m=+1698.272222663" Nov 27 11:37:45 crc kubenswrapper[4807]: I1127 11:37:45.532639 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:37:45 crc kubenswrapper[4807]: E1127 11:37:45.533543 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:37:56 crc kubenswrapper[4807]: I1127 11:37:56.049955 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-gw4hg"] Nov 27 11:37:56 crc kubenswrapper[4807]: I1127 11:37:56.063720 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-gw4hg"] Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.034762 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-a42a-account-create-update-254qb"] Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.042950 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7aae-account-create-update-5lvsf"] Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.052918 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-7t4pz"] Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.060345 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-pqzjj"] Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.066978 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-6113-account-create-update-xmtrf"] Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.073586 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-a42a-account-create-update-254qb"] Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.080933 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7aae-account-create-update-5lvsf"] Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.087531 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-7t4pz"] Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.094049 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-pqzjj"] Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.100406 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6113-account-create-update-xmtrf"] Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.549327 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25f3c060-bf19-4af4-be26-521942d50da4" path="/var/lib/kubelet/pods/25f3c060-bf19-4af4-be26-521942d50da4/volumes" Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.550115 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67c46baf-4492-41d3-a556-38709abf8e0c" path="/var/lib/kubelet/pods/67c46baf-4492-41d3-a556-38709abf8e0c/volumes" Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.550686 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ba3ef20-046c-4460-8533-132da27f4e06" path="/var/lib/kubelet/pods/7ba3ef20-046c-4460-8533-132da27f4e06/volumes" Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.551173 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="817319b3-2356-48a2-9922-79cfb7b00623" path="/var/lib/kubelet/pods/817319b3-2356-48a2-9922-79cfb7b00623/volumes" Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.552161 4807 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c6cc86c-1844-4444-85a9-370f5bc090d2" path="/var/lib/kubelet/pods/8c6cc86c-1844-4444-85a9-370f5bc090d2/volumes" Nov 27 11:37:57 crc kubenswrapper[4807]: I1127 11:37:57.552740 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efa0aaff-659e-4816-9cba-298325fbc28c" path="/var/lib/kubelet/pods/efa0aaff-659e-4816-9cba-298325fbc28c/volumes" Nov 27 11:37:58 crc kubenswrapper[4807]: I1127 11:37:58.533106 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:37:58 crc kubenswrapper[4807]: E1127 11:37:58.533637 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:38:13 crc kubenswrapper[4807]: I1127 11:38:13.532690 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:38:13 crc kubenswrapper[4807]: E1127 11:38:13.533652 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:38:18 crc kubenswrapper[4807]: I1127 11:38:18.027529 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cwjzz"] Nov 27 11:38:18 crc kubenswrapper[4807]: I1127 
11:38:18.035194 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cwjzz"] Nov 27 11:38:19 crc kubenswrapper[4807]: I1127 11:38:19.544035 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5650a6a6-37ed-42ec-812a-f94eb9a92117" path="/var/lib/kubelet/pods/5650a6a6-37ed-42ec-812a-f94eb9a92117/volumes" Nov 27 11:38:24 crc kubenswrapper[4807]: I1127 11:38:24.558906 4807 generic.go:334] "Generic (PLEG): container finished" podID="369043db-4f00-4bbd-ab16-6d8f27564af2" containerID="953d55f2d41f1847c19ab20065445355f726fd989f52abb294049a0fd87b9645" exitCode=0 Nov 27 11:38:24 crc kubenswrapper[4807]: I1127 11:38:24.558963 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" event={"ID":"369043db-4f00-4bbd-ab16-6d8f27564af2","Type":"ContainerDied","Data":"953d55f2d41f1847c19ab20065445355f726fd989f52abb294049a0fd87b9645"} Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.006640 4807 scope.go:117] "RemoveContainer" containerID="9c6ea88d3a58a15196122eb8ac4f99eacb5fb9228351bc90ed64c5134a32a727" Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.040698 4807 scope.go:117] "RemoveContainer" containerID="12409865f31fae447430df480013e4668bed3163998da7f70a270ef1e1b2799a" Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.081390 4807 scope.go:117] "RemoveContainer" containerID="90ff2d035d4aa231be37e6f060e39aeb77c62c088c7003f66bcc5e904da8b60b" Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.138129 4807 scope.go:117] "RemoveContainer" containerID="816e58a5835552db4016c0195edacf188db0a1379757b14a48c358cb20ac493b" Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.167994 4807 scope.go:117] "RemoveContainer" containerID="c0a0bf8dc8342630fc726a7a6966cac36479dcac76aeff10b6f25051ef5af7ae" Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.211183 4807 scope.go:117] "RemoveContainer" 
containerID="c44ef97808e4e5abd0e62650df1fe0e599294a67bd96897898931542cf97e03d" Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.253380 4807 scope.go:117] "RemoveContainer" containerID="5e50bcfadd380103be68eff6a81c37b3c4e3a0a6f66a1f7b471470bfaaba0b04" Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.868195 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.904209 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c98z\" (UniqueName: \"kubernetes.io/projected/369043db-4f00-4bbd-ab16-6d8f27564af2-kube-api-access-4c98z\") pod \"369043db-4f00-4bbd-ab16-6d8f27564af2\" (UID: \"369043db-4f00-4bbd-ab16-6d8f27564af2\") " Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.904548 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/369043db-4f00-4bbd-ab16-6d8f27564af2-inventory\") pod \"369043db-4f00-4bbd-ab16-6d8f27564af2\" (UID: \"369043db-4f00-4bbd-ab16-6d8f27564af2\") " Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.904616 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/369043db-4f00-4bbd-ab16-6d8f27564af2-ssh-key\") pod \"369043db-4f00-4bbd-ab16-6d8f27564af2\" (UID: \"369043db-4f00-4bbd-ab16-6d8f27564af2\") " Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.910110 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/369043db-4f00-4bbd-ab16-6d8f27564af2-kube-api-access-4c98z" (OuterVolumeSpecName: "kube-api-access-4c98z") pod "369043db-4f00-4bbd-ab16-6d8f27564af2" (UID: "369043db-4f00-4bbd-ab16-6d8f27564af2"). InnerVolumeSpecName "kube-api-access-4c98z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.934419 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369043db-4f00-4bbd-ab16-6d8f27564af2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "369043db-4f00-4bbd-ab16-6d8f27564af2" (UID: "369043db-4f00-4bbd-ab16-6d8f27564af2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:38:25 crc kubenswrapper[4807]: I1127 11:38:25.939819 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/369043db-4f00-4bbd-ab16-6d8f27564af2-inventory" (OuterVolumeSpecName: "inventory") pod "369043db-4f00-4bbd-ab16-6d8f27564af2" (UID: "369043db-4f00-4bbd-ab16-6d8f27564af2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.006633 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/369043db-4f00-4bbd-ab16-6d8f27564af2-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.006667 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/369043db-4f00-4bbd-ab16-6d8f27564af2-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.006682 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c98z\" (UniqueName: \"kubernetes.io/projected/369043db-4f00-4bbd-ab16-6d8f27564af2-kube-api-access-4c98z\") on node \"crc\" DevicePath \"\"" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.533634 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:38:26 crc kubenswrapper[4807]: E1127 11:38:26.534428 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.576997 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" event={"ID":"369043db-4f00-4bbd-ab16-6d8f27564af2","Type":"ContainerDied","Data":"e886c5ab15e466310f4b2bfcefac129438f5cbd4ffab38d67932a7d9d114a7fb"} Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.577043 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e886c5ab15e466310f4b2bfcefac129438f5cbd4ffab38d67932a7d9d114a7fb" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.577110 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.647943 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x8578"] Nov 27 11:38:26 crc kubenswrapper[4807]: E1127 11:38:26.648404 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369043db-4f00-4bbd-ab16-6d8f27564af2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.648428 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="369043db-4f00-4bbd-ab16-6d8f27564af2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.648680 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="369043db-4f00-4bbd-ab16-6d8f27564af2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.649523 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8578" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.651783 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.651827 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.652110 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.653738 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.663762 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x8578"] Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.719005 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f628332e-750f-45bd-994e-fcd01490e1e5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x8578\" (UID: \"f628332e-750f-45bd-994e-fcd01490e1e5\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8578" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.719091 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f628332e-750f-45bd-994e-fcd01490e1e5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x8578\" (UID: \"f628332e-750f-45bd-994e-fcd01490e1e5\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8578" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.719124 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fb5pc\" (UniqueName: \"kubernetes.io/projected/f628332e-750f-45bd-994e-fcd01490e1e5-kube-api-access-fb5pc\") pod \"ssh-known-hosts-edpm-deployment-x8578\" (UID: \"f628332e-750f-45bd-994e-fcd01490e1e5\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8578" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.820573 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f628332e-750f-45bd-994e-fcd01490e1e5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x8578\" (UID: \"f628332e-750f-45bd-994e-fcd01490e1e5\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8578" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.820876 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb5pc\" (UniqueName: \"kubernetes.io/projected/f628332e-750f-45bd-994e-fcd01490e1e5-kube-api-access-fb5pc\") pod \"ssh-known-hosts-edpm-deployment-x8578\" (UID: \"f628332e-750f-45bd-994e-fcd01490e1e5\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8578" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.821040 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f628332e-750f-45bd-994e-fcd01490e1e5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x8578\" (UID: \"f628332e-750f-45bd-994e-fcd01490e1e5\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8578" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.826314 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f628332e-750f-45bd-994e-fcd01490e1e5-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x8578\" (UID: \"f628332e-750f-45bd-994e-fcd01490e1e5\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8578" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 
11:38:26.826575 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f628332e-750f-45bd-994e-fcd01490e1e5-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x8578\" (UID: \"f628332e-750f-45bd-994e-fcd01490e1e5\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8578" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.844132 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb5pc\" (UniqueName: \"kubernetes.io/projected/f628332e-750f-45bd-994e-fcd01490e1e5-kube-api-access-fb5pc\") pod \"ssh-known-hosts-edpm-deployment-x8578\" (UID: \"f628332e-750f-45bd-994e-fcd01490e1e5\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8578" Nov 27 11:38:26 crc kubenswrapper[4807]: I1127 11:38:26.965507 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8578" Nov 27 11:38:27 crc kubenswrapper[4807]: I1127 11:38:27.493870 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x8578"] Nov 27 11:38:27 crc kubenswrapper[4807]: W1127 11:38:27.496694 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf628332e_750f_45bd_994e_fcd01490e1e5.slice/crio-5880ca9f39e6f9a8f2d96451dbd09fcbd73a6d86c57c3f1782344e4505a50a64 WatchSource:0}: Error finding container 5880ca9f39e6f9a8f2d96451dbd09fcbd73a6d86c57c3f1782344e4505a50a64: Status 404 returned error can't find the container with id 5880ca9f39e6f9a8f2d96451dbd09fcbd73a6d86c57c3f1782344e4505a50a64 Nov 27 11:38:27 crc kubenswrapper[4807]: I1127 11:38:27.586649 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8578" event={"ID":"f628332e-750f-45bd-994e-fcd01490e1e5","Type":"ContainerStarted","Data":"5880ca9f39e6f9a8f2d96451dbd09fcbd73a6d86c57c3f1782344e4505a50a64"} Nov 27 
11:38:28 crc kubenswrapper[4807]: I1127 11:38:28.595729 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8578" event={"ID":"f628332e-750f-45bd-994e-fcd01490e1e5","Type":"ContainerStarted","Data":"8d6251f428036f1ef32cca0d288839b5d3ca58f83faa1459e221665be018037e"} Nov 27 11:38:28 crc kubenswrapper[4807]: I1127 11:38:28.618286 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-x8578" podStartSLOduration=2.080875867 podStartE2EDuration="2.618268665s" podCreationTimestamp="2025-11-27 11:38:26 +0000 UTC" firstStartedPulling="2025-11-27 11:38:27.498725715 +0000 UTC m=+1748.598223913" lastFinishedPulling="2025-11-27 11:38:28.036118513 +0000 UTC m=+1749.135616711" observedRunningTime="2025-11-27 11:38:28.609986616 +0000 UTC m=+1749.709484804" watchObservedRunningTime="2025-11-27 11:38:28.618268665 +0000 UTC m=+1749.717766863" Nov 27 11:38:34 crc kubenswrapper[4807]: I1127 11:38:34.648211 4807 generic.go:334] "Generic (PLEG): container finished" podID="f628332e-750f-45bd-994e-fcd01490e1e5" containerID="8d6251f428036f1ef32cca0d288839b5d3ca58f83faa1459e221665be018037e" exitCode=0 Nov 27 11:38:34 crc kubenswrapper[4807]: I1127 11:38:34.648295 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8578" event={"ID":"f628332e-750f-45bd-994e-fcd01490e1e5","Type":"ContainerDied","Data":"8d6251f428036f1ef32cca0d288839b5d3ca58f83faa1459e221665be018037e"} Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.036629 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8578" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.093161 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f628332e-750f-45bd-994e-fcd01490e1e5-ssh-key-openstack-edpm-ipam\") pod \"f628332e-750f-45bd-994e-fcd01490e1e5\" (UID: \"f628332e-750f-45bd-994e-fcd01490e1e5\") " Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.093359 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb5pc\" (UniqueName: \"kubernetes.io/projected/f628332e-750f-45bd-994e-fcd01490e1e5-kube-api-access-fb5pc\") pod \"f628332e-750f-45bd-994e-fcd01490e1e5\" (UID: \"f628332e-750f-45bd-994e-fcd01490e1e5\") " Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.093426 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f628332e-750f-45bd-994e-fcd01490e1e5-inventory-0\") pod \"f628332e-750f-45bd-994e-fcd01490e1e5\" (UID: \"f628332e-750f-45bd-994e-fcd01490e1e5\") " Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.098229 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f628332e-750f-45bd-994e-fcd01490e1e5-kube-api-access-fb5pc" (OuterVolumeSpecName: "kube-api-access-fb5pc") pod "f628332e-750f-45bd-994e-fcd01490e1e5" (UID: "f628332e-750f-45bd-994e-fcd01490e1e5"). InnerVolumeSpecName "kube-api-access-fb5pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.122820 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f628332e-750f-45bd-994e-fcd01490e1e5-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f628332e-750f-45bd-994e-fcd01490e1e5" (UID: "f628332e-750f-45bd-994e-fcd01490e1e5"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.123517 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f628332e-750f-45bd-994e-fcd01490e1e5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f628332e-750f-45bd-994e-fcd01490e1e5" (UID: "f628332e-750f-45bd-994e-fcd01490e1e5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.194865 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb5pc\" (UniqueName: \"kubernetes.io/projected/f628332e-750f-45bd-994e-fcd01490e1e5-kube-api-access-fb5pc\") on node \"crc\" DevicePath \"\"" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.194904 4807 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f628332e-750f-45bd-994e-fcd01490e1e5-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.194914 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f628332e-750f-45bd-994e-fcd01490e1e5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.667087 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8578" event={"ID":"f628332e-750f-45bd-994e-fcd01490e1e5","Type":"ContainerDied","Data":"5880ca9f39e6f9a8f2d96451dbd09fcbd73a6d86c57c3f1782344e4505a50a64"} Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.667452 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5880ca9f39e6f9a8f2d96451dbd09fcbd73a6d86c57c3f1782344e4505a50a64" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.667160 
4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8578" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.771048 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67"] Nov 27 11:38:36 crc kubenswrapper[4807]: E1127 11:38:36.771948 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f628332e-750f-45bd-994e-fcd01490e1e5" containerName="ssh-known-hosts-edpm-deployment" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.771970 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="f628332e-750f-45bd-994e-fcd01490e1e5" containerName="ssh-known-hosts-edpm-deployment" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.772530 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="f628332e-750f-45bd-994e-fcd01490e1e5" containerName="ssh-known-hosts-edpm-deployment" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.773651 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.777778 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.778791 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.779057 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.779393 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.811137 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hbp6\" (UniqueName: \"kubernetes.io/projected/345077b8-ac19-43cb-8eee-e6112034320c-kube-api-access-5hbp6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bdw67\" (UID: \"345077b8-ac19-43cb-8eee-e6112034320c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.811281 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/345077b8-ac19-43cb-8eee-e6112034320c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bdw67\" (UID: \"345077b8-ac19-43cb-8eee-e6112034320c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.811322 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/345077b8-ac19-43cb-8eee-e6112034320c-inventory\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-bdw67\" (UID: \"345077b8-ac19-43cb-8eee-e6112034320c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.817326 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67"] Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.913224 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/345077b8-ac19-43cb-8eee-e6112034320c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bdw67\" (UID: \"345077b8-ac19-43cb-8eee-e6112034320c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.913497 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/345077b8-ac19-43cb-8eee-e6112034320c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bdw67\" (UID: \"345077b8-ac19-43cb-8eee-e6112034320c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.913655 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hbp6\" (UniqueName: \"kubernetes.io/projected/345077b8-ac19-43cb-8eee-e6112034320c-kube-api-access-5hbp6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bdw67\" (UID: \"345077b8-ac19-43cb-8eee-e6112034320c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.917404 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/345077b8-ac19-43cb-8eee-e6112034320c-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bdw67\" (UID: \"345077b8-ac19-43cb-8eee-e6112034320c\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.925988 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/345077b8-ac19-43cb-8eee-e6112034320c-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bdw67\" (UID: \"345077b8-ac19-43cb-8eee-e6112034320c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" Nov 27 11:38:36 crc kubenswrapper[4807]: I1127 11:38:36.931691 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hbp6\" (UniqueName: \"kubernetes.io/projected/345077b8-ac19-43cb-8eee-e6112034320c-kube-api-access-5hbp6\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-bdw67\" (UID: \"345077b8-ac19-43cb-8eee-e6112034320c\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" Nov 27 11:38:37 crc kubenswrapper[4807]: I1127 11:38:37.114322 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" Nov 27 11:38:37 crc kubenswrapper[4807]: I1127 11:38:37.532998 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:38:37 crc kubenswrapper[4807]: E1127 11:38:37.540070 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:38:37 crc kubenswrapper[4807]: I1127 11:38:37.636404 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67"] Nov 27 11:38:37 crc kubenswrapper[4807]: I1127 11:38:37.682920 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" event={"ID":"345077b8-ac19-43cb-8eee-e6112034320c","Type":"ContainerStarted","Data":"caf6fbbd4ee712fadd95ff43b2014898ea64a0271fd974413657e799a35cb0ea"} Nov 27 11:38:38 crc kubenswrapper[4807]: I1127 11:38:38.692094 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" event={"ID":"345077b8-ac19-43cb-8eee-e6112034320c","Type":"ContainerStarted","Data":"800b831ad318b76a98921db627e13411e282dfd068ee842403626150920ab910"} Nov 27 11:38:38 crc kubenswrapper[4807]: I1127 11:38:38.706399 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" podStartSLOduration=2.110003453 podStartE2EDuration="2.706379194s" podCreationTimestamp="2025-11-27 11:38:36 +0000 UTC" firstStartedPulling="2025-11-27 11:38:37.639361376 +0000 UTC 
m=+1758.738859574" lastFinishedPulling="2025-11-27 11:38:38.235737117 +0000 UTC m=+1759.335235315" observedRunningTime="2025-11-27 11:38:38.705177022 +0000 UTC m=+1759.804675220" watchObservedRunningTime="2025-11-27 11:38:38.706379194 +0000 UTC m=+1759.805877392" Nov 27 11:38:40 crc kubenswrapper[4807]: I1127 11:38:40.045494 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-mrj5n"] Nov 27 11:38:40 crc kubenswrapper[4807]: I1127 11:38:40.058643 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mznhj"] Nov 27 11:38:40 crc kubenswrapper[4807]: I1127 11:38:40.069730 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mznhj"] Nov 27 11:38:40 crc kubenswrapper[4807]: I1127 11:38:40.078614 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-mrj5n"] Nov 27 11:38:41 crc kubenswrapper[4807]: I1127 11:38:41.542088 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6dc415-3063-48ab-8a84-27a041b110f4" path="/var/lib/kubelet/pods/9f6dc415-3063-48ab-8a84-27a041b110f4/volumes" Nov 27 11:38:41 crc kubenswrapper[4807]: I1127 11:38:41.543091 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d88895da-e0b3-40c7-82bf-eb68882e01cd" path="/var/lib/kubelet/pods/d88895da-e0b3-40c7-82bf-eb68882e01cd/volumes" Nov 27 11:38:46 crc kubenswrapper[4807]: I1127 11:38:46.758422 4807 generic.go:334] "Generic (PLEG): container finished" podID="345077b8-ac19-43cb-8eee-e6112034320c" containerID="800b831ad318b76a98921db627e13411e282dfd068ee842403626150920ab910" exitCode=0 Nov 27 11:38:46 crc kubenswrapper[4807]: I1127 11:38:46.758498 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" 
event={"ID":"345077b8-ac19-43cb-8eee-e6112034320c","Type":"ContainerDied","Data":"800b831ad318b76a98921db627e13411e282dfd068ee842403626150920ab910"} Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.187837 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.326420 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hbp6\" (UniqueName: \"kubernetes.io/projected/345077b8-ac19-43cb-8eee-e6112034320c-kube-api-access-5hbp6\") pod \"345077b8-ac19-43cb-8eee-e6112034320c\" (UID: \"345077b8-ac19-43cb-8eee-e6112034320c\") " Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.326544 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/345077b8-ac19-43cb-8eee-e6112034320c-ssh-key\") pod \"345077b8-ac19-43cb-8eee-e6112034320c\" (UID: \"345077b8-ac19-43cb-8eee-e6112034320c\") " Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.326685 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/345077b8-ac19-43cb-8eee-e6112034320c-inventory\") pod \"345077b8-ac19-43cb-8eee-e6112034320c\" (UID: \"345077b8-ac19-43cb-8eee-e6112034320c\") " Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.331664 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/345077b8-ac19-43cb-8eee-e6112034320c-kube-api-access-5hbp6" (OuterVolumeSpecName: "kube-api-access-5hbp6") pod "345077b8-ac19-43cb-8eee-e6112034320c" (UID: "345077b8-ac19-43cb-8eee-e6112034320c"). InnerVolumeSpecName "kube-api-access-5hbp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.351638 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/345077b8-ac19-43cb-8eee-e6112034320c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "345077b8-ac19-43cb-8eee-e6112034320c" (UID: "345077b8-ac19-43cb-8eee-e6112034320c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.354191 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/345077b8-ac19-43cb-8eee-e6112034320c-inventory" (OuterVolumeSpecName: "inventory") pod "345077b8-ac19-43cb-8eee-e6112034320c" (UID: "345077b8-ac19-43cb-8eee-e6112034320c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.429368 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/345077b8-ac19-43cb-8eee-e6112034320c-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.429401 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hbp6\" (UniqueName: \"kubernetes.io/projected/345077b8-ac19-43cb-8eee-e6112034320c-kube-api-access-5hbp6\") on node \"crc\" DevicePath \"\"" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.429412 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/345077b8-ac19-43cb-8eee-e6112034320c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.780362 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" 
event={"ID":"345077b8-ac19-43cb-8eee-e6112034320c","Type":"ContainerDied","Data":"caf6fbbd4ee712fadd95ff43b2014898ea64a0271fd974413657e799a35cb0ea"} Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.780675 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caf6fbbd4ee712fadd95ff43b2014898ea64a0271fd974413657e799a35cb0ea" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.780443 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-bdw67" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.862217 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb"] Nov 27 11:38:48 crc kubenswrapper[4807]: E1127 11:38:48.862668 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="345077b8-ac19-43cb-8eee-e6112034320c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.862689 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="345077b8-ac19-43cb-8eee-e6112034320c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.862924 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="345077b8-ac19-43cb-8eee-e6112034320c" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.863736 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.865603 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.865609 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.865897 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.867668 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:38:48 crc kubenswrapper[4807]: I1127 11:38:48.874456 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb"] Nov 27 11:38:49 crc kubenswrapper[4807]: I1127 11:38:49.040601 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxks5\" (UniqueName: \"kubernetes.io/projected/cff69888-3585-4127-a2e6-122a7fdfe894-kube-api-access-mxks5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb\" (UID: \"cff69888-3585-4127-a2e6-122a7fdfe894\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" Nov 27 11:38:49 crc kubenswrapper[4807]: I1127 11:38:49.040655 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cff69888-3585-4127-a2e6-122a7fdfe894-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb\" (UID: \"cff69888-3585-4127-a2e6-122a7fdfe894\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" Nov 27 11:38:49 crc kubenswrapper[4807]: I1127 11:38:49.040743 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cff69888-3585-4127-a2e6-122a7fdfe894-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb\" (UID: \"cff69888-3585-4127-a2e6-122a7fdfe894\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" Nov 27 11:38:49 crc kubenswrapper[4807]: I1127 11:38:49.142800 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cff69888-3585-4127-a2e6-122a7fdfe894-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb\" (UID: \"cff69888-3585-4127-a2e6-122a7fdfe894\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" Nov 27 11:38:49 crc kubenswrapper[4807]: I1127 11:38:49.143090 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxks5\" (UniqueName: \"kubernetes.io/projected/cff69888-3585-4127-a2e6-122a7fdfe894-kube-api-access-mxks5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb\" (UID: \"cff69888-3585-4127-a2e6-122a7fdfe894\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" Nov 27 11:38:49 crc kubenswrapper[4807]: I1127 11:38:49.143155 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cff69888-3585-4127-a2e6-122a7fdfe894-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb\" (UID: \"cff69888-3585-4127-a2e6-122a7fdfe894\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" Nov 27 11:38:49 crc kubenswrapper[4807]: I1127 11:38:49.148101 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cff69888-3585-4127-a2e6-122a7fdfe894-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb\" (UID: \"cff69888-3585-4127-a2e6-122a7fdfe894\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" Nov 27 11:38:49 crc kubenswrapper[4807]: I1127 11:38:49.151427 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cff69888-3585-4127-a2e6-122a7fdfe894-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb\" (UID: \"cff69888-3585-4127-a2e6-122a7fdfe894\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" Nov 27 11:38:49 crc kubenswrapper[4807]: I1127 11:38:49.165967 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxks5\" (UniqueName: \"kubernetes.io/projected/cff69888-3585-4127-a2e6-122a7fdfe894-kube-api-access-mxks5\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb\" (UID: \"cff69888-3585-4127-a2e6-122a7fdfe894\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" Nov 27 11:38:49 crc kubenswrapper[4807]: I1127 11:38:49.179283 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" Nov 27 11:38:49 crc kubenswrapper[4807]: I1127 11:38:49.521877 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb"] Nov 27 11:38:49 crc kubenswrapper[4807]: I1127 11:38:49.791538 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" event={"ID":"cff69888-3585-4127-a2e6-122a7fdfe894","Type":"ContainerStarted","Data":"a67e5e73399360b7a572070faa0b2d9493b9cfb45d9cf5f13b363848f9c20bf3"} Nov 27 11:38:50 crc kubenswrapper[4807]: I1127 11:38:50.532511 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:38:50 crc kubenswrapper[4807]: E1127 11:38:50.533081 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:38:50 crc kubenswrapper[4807]: I1127 11:38:50.800322 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" event={"ID":"cff69888-3585-4127-a2e6-122a7fdfe894","Type":"ContainerStarted","Data":"39e79ecc8828847bd55789975d8fe68445dde9f222b24add50f20d0356a5d799"} Nov 27 11:38:50 crc kubenswrapper[4807]: I1127 11:38:50.824235 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" podStartSLOduration=2.338400393 podStartE2EDuration="2.824214942s" podCreationTimestamp="2025-11-27 11:38:48 +0000 UTC" firstStartedPulling="2025-11-27 
11:38:49.529103797 +0000 UTC m=+1770.628601995" lastFinishedPulling="2025-11-27 11:38:50.014918326 +0000 UTC m=+1771.114416544" observedRunningTime="2025-11-27 11:38:50.817622327 +0000 UTC m=+1771.917120525" watchObservedRunningTime="2025-11-27 11:38:50.824214942 +0000 UTC m=+1771.923713140" Nov 27 11:38:59 crc kubenswrapper[4807]: I1127 11:38:59.878767 4807 generic.go:334] "Generic (PLEG): container finished" podID="cff69888-3585-4127-a2e6-122a7fdfe894" containerID="39e79ecc8828847bd55789975d8fe68445dde9f222b24add50f20d0356a5d799" exitCode=0 Nov 27 11:38:59 crc kubenswrapper[4807]: I1127 11:38:59.878887 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" event={"ID":"cff69888-3585-4127-a2e6-122a7fdfe894","Type":"ContainerDied","Data":"39e79ecc8828847bd55789975d8fe68445dde9f222b24add50f20d0356a5d799"} Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.360547 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.458058 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxks5\" (UniqueName: \"kubernetes.io/projected/cff69888-3585-4127-a2e6-122a7fdfe894-kube-api-access-mxks5\") pod \"cff69888-3585-4127-a2e6-122a7fdfe894\" (UID: \"cff69888-3585-4127-a2e6-122a7fdfe894\") " Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.458110 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cff69888-3585-4127-a2e6-122a7fdfe894-ssh-key\") pod \"cff69888-3585-4127-a2e6-122a7fdfe894\" (UID: \"cff69888-3585-4127-a2e6-122a7fdfe894\") " Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.458144 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cff69888-3585-4127-a2e6-122a7fdfe894-inventory\") pod \"cff69888-3585-4127-a2e6-122a7fdfe894\" (UID: \"cff69888-3585-4127-a2e6-122a7fdfe894\") " Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.464333 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cff69888-3585-4127-a2e6-122a7fdfe894-kube-api-access-mxks5" (OuterVolumeSpecName: "kube-api-access-mxks5") pod "cff69888-3585-4127-a2e6-122a7fdfe894" (UID: "cff69888-3585-4127-a2e6-122a7fdfe894"). InnerVolumeSpecName "kube-api-access-mxks5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.485209 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff69888-3585-4127-a2e6-122a7fdfe894-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "cff69888-3585-4127-a2e6-122a7fdfe894" (UID: "cff69888-3585-4127-a2e6-122a7fdfe894"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.486915 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cff69888-3585-4127-a2e6-122a7fdfe894-inventory" (OuterVolumeSpecName: "inventory") pod "cff69888-3585-4127-a2e6-122a7fdfe894" (UID: "cff69888-3585-4127-a2e6-122a7fdfe894"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.560896 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxks5\" (UniqueName: \"kubernetes.io/projected/cff69888-3585-4127-a2e6-122a7fdfe894-kube-api-access-mxks5\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.560936 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/cff69888-3585-4127-a2e6-122a7fdfe894-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.560971 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cff69888-3585-4127-a2e6-122a7fdfe894-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.905981 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" event={"ID":"cff69888-3585-4127-a2e6-122a7fdfe894","Type":"ContainerDied","Data":"a67e5e73399360b7a572070faa0b2d9493b9cfb45d9cf5f13b363848f9c20bf3"} Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.906028 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a67e5e73399360b7a572070faa0b2d9493b9cfb45d9cf5f13b363848f9c20bf3" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.906119 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.994100 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2"] Nov 27 11:39:01 crc kubenswrapper[4807]: E1127 11:39:01.994924 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cff69888-3585-4127-a2e6-122a7fdfe894" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.994944 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff69888-3585-4127-a2e6-122a7fdfe894" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.995140 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="cff69888-3585-4127-a2e6-122a7fdfe894" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.996035 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.998859 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.999058 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Nov 27 11:39:01 crc kubenswrapper[4807]: I1127 11:39:01.999092 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.000632 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.000968 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.001132 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.001314 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.001465 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.005920 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2"] Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182196 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182278 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182301 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182325 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182349 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8854\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-kube-api-access-z8854\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182471 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182539 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182608 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182671 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182710 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182755 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182772 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182797 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.182819 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.284488 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.284747 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.284870 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-libvirt-combined-ca-bundle\") 
pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.284986 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.285079 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.285194 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.285342 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.285440 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.285556 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.285706 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8854\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-kube-api-access-z8854\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.286142 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 
11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.286336 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.286502 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.286641 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.289632 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.289686 4807 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.290149 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.290475 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.291038 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.291956 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.293167 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.293765 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.295424 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.295571 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.299562 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.296100 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.306559 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8854\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-kube-api-access-z8854\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.307494 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.326615 4807 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.898076 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2"] Nov 27 11:39:02 crc kubenswrapper[4807]: I1127 11:39:02.918605 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" event={"ID":"e50793ea-c215-407b-ac8f-a5767166a0dd","Type":"ContainerStarted","Data":"33f85f8e363529fac01261a7f5e696c3c1f8a4c693ed24af529972820a8eee01"} Nov 27 11:39:03 crc kubenswrapper[4807]: I1127 11:39:03.934612 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" event={"ID":"e50793ea-c215-407b-ac8f-a5767166a0dd","Type":"ContainerStarted","Data":"bf77d0e970ff53733ce70213e67fe63416fa7e5db636fa0ff6a2fcd3e67cb5b8"} Nov 27 11:39:03 crc kubenswrapper[4807]: I1127 11:39:03.972603 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" podStartSLOduration=2.274105125 podStartE2EDuration="2.972582903s" podCreationTimestamp="2025-11-27 11:39:01 +0000 UTC" firstStartedPulling="2025-11-27 11:39:02.9016164 +0000 UTC m=+1784.001114618" lastFinishedPulling="2025-11-27 11:39:03.600094158 +0000 UTC m=+1784.699592396" observedRunningTime="2025-11-27 11:39:03.954968776 +0000 UTC m=+1785.054467014" watchObservedRunningTime="2025-11-27 11:39:03.972582903 +0000 UTC m=+1785.072081101" Nov 27 11:39:05 crc kubenswrapper[4807]: I1127 11:39:05.532873 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:39:05 crc kubenswrapper[4807]: E1127 11:39:05.533614 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:39:17 crc kubenswrapper[4807]: I1127 11:39:17.534157 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:39:17 crc kubenswrapper[4807]: E1127 11:39:17.535657 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:39:25 crc kubenswrapper[4807]: I1127 11:39:25.380690 4807 scope.go:117] "RemoveContainer" containerID="d6188fb02897da9f75290e681f1df02212cf0a1b2f284fd917be65d34e698154" Nov 27 11:39:25 crc kubenswrapper[4807]: I1127 11:39:25.436391 4807 scope.go:117] "RemoveContainer" containerID="3a8f63156e2a97220998270496878d887c1bf32ef7b88dbac31e38b555e00e8c" Nov 27 11:39:26 crc kubenswrapper[4807]: I1127 11:39:26.054376 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hwsgz"] Nov 27 11:39:26 crc kubenswrapper[4807]: I1127 11:39:26.066848 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hwsgz"] Nov 27 11:39:27 crc kubenswrapper[4807]: I1127 11:39:27.547119 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91906330-92d8-46f3-97a8-a7b2cfd31d6c" path="/var/lib/kubelet/pods/91906330-92d8-46f3-97a8-a7b2cfd31d6c/volumes" Nov 27 11:39:31 crc 
kubenswrapper[4807]: I1127 11:39:31.532188 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:39:31 crc kubenswrapper[4807]: E1127 11:39:31.533010 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:39:40 crc kubenswrapper[4807]: I1127 11:39:40.274110 4807 generic.go:334] "Generic (PLEG): container finished" podID="e50793ea-c215-407b-ac8f-a5767166a0dd" containerID="bf77d0e970ff53733ce70213e67fe63416fa7e5db636fa0ff6a2fcd3e67cb5b8" exitCode=0 Nov 27 11:39:40 crc kubenswrapper[4807]: I1127 11:39:40.274276 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" event={"ID":"e50793ea-c215-407b-ac8f-a5767166a0dd","Type":"ContainerDied","Data":"bf77d0e970ff53733ce70213e67fe63416fa7e5db636fa0ff6a2fcd3e67cb5b8"} Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.709579 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.844685 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.844733 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8854\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-kube-api-access-z8854\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.844774 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-ovn-default-certs-0\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.844811 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-nova-combined-ca-bundle\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.844859 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-libvirt-combined-ca-bundle\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: 
\"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.844900 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-ovn-combined-ca-bundle\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.844915 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-bootstrap-combined-ca-bundle\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.844994 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-inventory\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.845027 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-neutron-metadata-combined-ca-bundle\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.845043 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-ssh-key\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.845080 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-telemetry-combined-ca-bundle\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.845152 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.845175 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-repo-setup-combined-ca-bundle\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.845207 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"e50793ea-c215-407b-ac8f-a5767166a0dd\" (UID: \"e50793ea-c215-407b-ac8f-a5767166a0dd\") " Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.851437 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.851497 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.851659 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-kube-api-access-z8854" (OuterVolumeSpecName: "kube-api-access-z8854") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). InnerVolumeSpecName "kube-api-access-z8854". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.851932 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.853594 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.853665 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.855709 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.855866 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.856682 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). 
InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.856729 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.857907 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.862899 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.879596 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.886228 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-inventory" (OuterVolumeSpecName: "inventory") pod "e50793ea-c215-407b-ac8f-a5767166a0dd" (UID: "e50793ea-c215-407b-ac8f-a5767166a0dd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.947167 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.947198 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8854\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-kube-api-access-z8854\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.947213 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.947227 4807 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.947240 4807 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:41 crc 
kubenswrapper[4807]: I1127 11:39:41.947269 4807 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.947281 4807 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.947293 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.947305 4807 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.947316 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.947328 4807 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.948304 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 27 
11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.948320 4807 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e50793ea-c215-407b-ac8f-a5767166a0dd-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:41 crc kubenswrapper[4807]: I1127 11:39:41.948334 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e50793ea-c215-407b-ac8f-a5767166a0dd-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.303085 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" event={"ID":"e50793ea-c215-407b-ac8f-a5767166a0dd","Type":"ContainerDied","Data":"33f85f8e363529fac01261a7f5e696c3c1f8a4c693ed24af529972820a8eee01"} Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.303138 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33f85f8e363529fac01261a7f5e696c3c1f8a4c693ed24af529972820a8eee01" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.303161 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.414496 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj"] Nov 27 11:39:42 crc kubenswrapper[4807]: E1127 11:39:42.415014 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e50793ea-c215-407b-ac8f-a5767166a0dd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.415040 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50793ea-c215-407b-ac8f-a5767166a0dd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.415367 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="e50793ea-c215-407b-ac8f-a5767166a0dd" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.416183 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.421882 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.422016 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj"] Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.423764 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.423847 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.423932 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.424073 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.559482 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.559633 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.559981 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.560106 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.560132 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4c7p\" (UniqueName: \"kubernetes.io/projected/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-kube-api-access-s4c7p\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.662043 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.662687 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.662949 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.663183 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4c7p\" (UniqueName: \"kubernetes.io/projected/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-kube-api-access-s4c7p\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.663278 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.663749 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc 
kubenswrapper[4807]: I1127 11:39:42.666651 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.667467 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.670951 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.688167 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4c7p\" (UniqueName: \"kubernetes.io/projected/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-kube-api-access-s4c7p\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-b2bfj\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:42 crc kubenswrapper[4807]: I1127 11:39:42.766871 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:39:43 crc kubenswrapper[4807]: I1127 11:39:43.289778 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj"] Nov 27 11:39:43 crc kubenswrapper[4807]: I1127 11:39:43.320980 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" event={"ID":"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db","Type":"ContainerStarted","Data":"92095d3ebd5451b2169cb3dd5faf518a362c8d4d92ab0cab142bd2e48010743b"} Nov 27 11:39:44 crc kubenswrapper[4807]: I1127 11:39:44.333059 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" event={"ID":"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db","Type":"ContainerStarted","Data":"c079a4a30c64ea07939384dbdf73f0076cc646ef463c20d745754432df84b760"} Nov 27 11:39:44 crc kubenswrapper[4807]: I1127 11:39:44.358277 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" podStartSLOduration=1.5540233730000002 podStartE2EDuration="2.358232073s" podCreationTimestamp="2025-11-27 11:39:42 +0000 UTC" firstStartedPulling="2025-11-27 11:39:43.298828758 +0000 UTC m=+1824.398326956" lastFinishedPulling="2025-11-27 11:39:44.103037448 +0000 UTC m=+1825.202535656" observedRunningTime="2025-11-27 11:39:44.346617476 +0000 UTC m=+1825.446115674" watchObservedRunningTime="2025-11-27 11:39:44.358232073 +0000 UTC m=+1825.457730302" Nov 27 11:39:45 crc kubenswrapper[4807]: I1127 11:39:45.532682 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:39:45 crc kubenswrapper[4807]: E1127 11:39:45.532930 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:39:57 crc kubenswrapper[4807]: I1127 11:39:57.532771 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:39:57 crc kubenswrapper[4807]: E1127 11:39:57.533656 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:40:09 crc kubenswrapper[4807]: I1127 11:40:09.546322 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:40:09 crc kubenswrapper[4807]: E1127 11:40:09.546986 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:40:21 crc kubenswrapper[4807]: I1127 11:40:21.533344 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:40:21 crc kubenswrapper[4807]: E1127 11:40:21.536170 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:40:25 crc kubenswrapper[4807]: I1127 11:40:25.552284 4807 scope.go:117] "RemoveContainer" containerID="bf8b13017a4dd3696404757dc4f99cea793ecd72143d5a03ea2f71ea0fa9603e" Nov 27 11:40:35 crc kubenswrapper[4807]: I1127 11:40:35.532230 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:40:35 crc kubenswrapper[4807]: E1127 11:40:35.533406 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:40:45 crc kubenswrapper[4807]: I1127 11:40:45.596756 4807 generic.go:334] "Generic (PLEG): container finished" podID="ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db" containerID="c079a4a30c64ea07939384dbdf73f0076cc646ef463c20d745754432df84b760" exitCode=0 Nov 27 11:40:45 crc kubenswrapper[4807]: I1127 11:40:45.596803 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" event={"ID":"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db","Type":"ContainerDied","Data":"c079a4a30c64ea07939384dbdf73f0076cc646ef463c20d745754432df84b760"} Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.066548 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.147920 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-inventory\") pod \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.148205 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ssh-key\") pod \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.148439 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ovncontroller-config-0\") pod \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.148644 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4c7p\" (UniqueName: \"kubernetes.io/projected/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-kube-api-access-s4c7p\") pod \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.148738 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ovn-combined-ca-bundle\") pod \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\" (UID: \"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db\") " Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.154415 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-kube-api-access-s4c7p" (OuterVolumeSpecName: "kube-api-access-s4c7p") pod "ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db" (UID: "ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db"). InnerVolumeSpecName "kube-api-access-s4c7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.154637 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db" (UID: "ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.180943 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db" (UID: "ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.181868 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-inventory" (OuterVolumeSpecName: "inventory") pod "ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db" (UID: "ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.191233 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db" (UID: "ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.251389 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4c7p\" (UniqueName: \"kubernetes.io/projected/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-kube-api-access-s4c7p\") on node \"crc\" DevicePath \"\"" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.251426 4807 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.251436 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.251444 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.251454 4807 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.619509 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" event={"ID":"ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db","Type":"ContainerDied","Data":"92095d3ebd5451b2169cb3dd5faf518a362c8d4d92ab0cab142bd2e48010743b"} Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.619834 4807 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="92095d3ebd5451b2169cb3dd5faf518a362c8d4d92ab0cab142bd2e48010743b" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.619574 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-b2bfj" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.726003 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx"] Nov 27 11:40:47 crc kubenswrapper[4807]: E1127 11:40:47.726412 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.726431 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.726630 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.727212 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.733100 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.734391 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.735270 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx"] Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.735458 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.735607 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.735661 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.735681 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.862275 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.862937 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.863072 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.863298 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8h6k\" (UniqueName: \"kubernetes.io/projected/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-kube-api-access-q8h6k\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.863543 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.863613 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.965854 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.965923 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.965970 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8h6k\" (UniqueName: \"kubernetes.io/projected/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-kube-api-access-q8h6k\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.966020 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.966045 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.966131 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.972859 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.973521 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.974310 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.975532 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.975879 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:47 crc kubenswrapper[4807]: I1127 11:40:47.991487 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8h6k\" (UniqueName: \"kubernetes.io/projected/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-kube-api-access-q8h6k\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 
11:40:48 crc kubenswrapper[4807]: I1127 11:40:48.091420 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:40:48 crc kubenswrapper[4807]: I1127 11:40:48.589069 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx"] Nov 27 11:40:48 crc kubenswrapper[4807]: I1127 11:40:48.616407 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 11:40:48 crc kubenswrapper[4807]: I1127 11:40:48.640752 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" event={"ID":"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd","Type":"ContainerStarted","Data":"b7f0eba13d2e2e3ea1e30aa5c677a57ad791d7cdecb2587ef0c9ae1caa30a0a5"} Nov 27 11:40:49 crc kubenswrapper[4807]: I1127 11:40:49.540598 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:40:49 crc kubenswrapper[4807]: E1127 11:40:49.541576 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:40:50 crc kubenswrapper[4807]: I1127 11:40:50.658758 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" event={"ID":"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd","Type":"ContainerStarted","Data":"dc278d11433ad6b0ed6fa993d9ebe41533f604903841b7848b9c598ce3dd34db"} Nov 27 11:40:50 crc kubenswrapper[4807]: I1127 11:40:50.681508 4807 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" podStartSLOduration=2.6236265899999998 podStartE2EDuration="3.681488236s" podCreationTimestamp="2025-11-27 11:40:47 +0000 UTC" firstStartedPulling="2025-11-27 11:40:48.615921705 +0000 UTC m=+1889.715419913" lastFinishedPulling="2025-11-27 11:40:49.673783311 +0000 UTC m=+1890.773281559" observedRunningTime="2025-11-27 11:40:50.674010128 +0000 UTC m=+1891.773508326" watchObservedRunningTime="2025-11-27 11:40:50.681488236 +0000 UTC m=+1891.780986434" Nov 27 11:41:00 crc kubenswrapper[4807]: I1127 11:41:00.532732 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:41:00 crc kubenswrapper[4807]: E1127 11:41:00.533758 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:41:15 crc kubenswrapper[4807]: I1127 11:41:15.533390 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:41:15 crc kubenswrapper[4807]: E1127 11:41:15.534573 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:41:28 crc kubenswrapper[4807]: I1127 
11:41:28.533061 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:41:28 crc kubenswrapper[4807]: E1127 11:41:28.534080 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:41:36 crc kubenswrapper[4807]: I1127 11:41:36.221354 4807 generic.go:334] "Generic (PLEG): container finished" podID="2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd" containerID="dc278d11433ad6b0ed6fa993d9ebe41533f604903841b7848b9c598ce3dd34db" exitCode=0 Nov 27 11:41:36 crc kubenswrapper[4807]: I1127 11:41:36.221481 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" event={"ID":"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd","Type":"ContainerDied","Data":"dc278d11433ad6b0ed6fa993d9ebe41533f604903841b7848b9c598ce3dd34db"} Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.779026 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.829666 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-ssh-key\") pod \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.829716 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-neutron-metadata-combined-ca-bundle\") pod \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.829770 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-inventory\") pod \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.829801 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-neutron-ovn-metadata-agent-neutron-config-0\") pod \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.829843 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8h6k\" (UniqueName: \"kubernetes.io/projected/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-kube-api-access-q8h6k\") pod \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 
11:41:37.829883 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-nova-metadata-neutron-config-0\") pod \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\" (UID: \"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd\") " Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.835302 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd" (UID: "2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.840558 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-kube-api-access-q8h6k" (OuterVolumeSpecName: "kube-api-access-q8h6k") pod "2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd" (UID: "2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd"). InnerVolumeSpecName "kube-api-access-q8h6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.859422 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd" (UID: "2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.863503 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd" (UID: "2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.866031 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-inventory" (OuterVolumeSpecName: "inventory") pod "2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd" (UID: "2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.882201 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd" (UID: "2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.932481 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.932529 4807 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.932553 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.932576 4807 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.932596 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8h6k\" (UniqueName: \"kubernetes.io/projected/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-kube-api-access-q8h6k\") on node \"crc\" DevicePath \"\"" Nov 27 11:41:37 crc kubenswrapper[4807]: I1127 11:41:37.932616 4807 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.245409 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" 
event={"ID":"2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd","Type":"ContainerDied","Data":"b7f0eba13d2e2e3ea1e30aa5c677a57ad791d7cdecb2587ef0c9ae1caa30a0a5"} Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.245453 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7f0eba13d2e2e3ea1e30aa5c677a57ad791d7cdecb2587ef0c9ae1caa30a0a5" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.245527 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.360797 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm"] Nov 27 11:41:38 crc kubenswrapper[4807]: E1127 11:41:38.361932 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.361980 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.362546 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.363912 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.365856 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.366332 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.366430 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.367288 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.368757 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.386726 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm"] Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.440595 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.440672 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.440748 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.440817 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.440884 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktq9s\" (UniqueName: \"kubernetes.io/projected/36b0f83c-c6d3-4d4b-9675-478b3f02f952-kube-api-access-ktq9s\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.552895 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktq9s\" (UniqueName: \"kubernetes.io/projected/36b0f83c-c6d3-4d4b-9675-478b3f02f952-kube-api-access-ktq9s\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.554223 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.554398 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.554631 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.554828 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.562333 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.563052 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.566311 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.567620 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.573681 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktq9s\" (UniqueName: \"kubernetes.io/projected/36b0f83c-c6d3-4d4b-9675-478b3f02f952-kube-api-access-ktq9s\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:38 crc kubenswrapper[4807]: I1127 11:41:38.681402 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:41:39 crc kubenswrapper[4807]: I1127 11:41:39.339869 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm"] Nov 27 11:41:40 crc kubenswrapper[4807]: I1127 11:41:40.269810 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" event={"ID":"36b0f83c-c6d3-4d4b-9675-478b3f02f952","Type":"ContainerStarted","Data":"c9ada6ef2802d0ec7ecc5217fa1c4dc717cb8609d23ee9bb4af67f91544e2deb"} Nov 27 11:41:40 crc kubenswrapper[4807]: I1127 11:41:40.270436 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" event={"ID":"36b0f83c-c6d3-4d4b-9675-478b3f02f952","Type":"ContainerStarted","Data":"3bf7ed0eca421cfd1e3abe1349b76d0f6294ac10b3a5a27d122ae83fa623ff4d"} Nov 27 11:41:40 crc kubenswrapper[4807]: I1127 11:41:40.296196 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" podStartSLOduration=1.873355764 podStartE2EDuration="2.296168933s" podCreationTimestamp="2025-11-27 11:41:38 +0000 UTC" firstStartedPulling="2025-11-27 11:41:39.356633985 +0000 UTC m=+1940.456132203" lastFinishedPulling="2025-11-27 11:41:39.779447164 +0000 UTC m=+1940.878945372" observedRunningTime="2025-11-27 11:41:40.291308454 +0000 UTC m=+1941.390806692" watchObservedRunningTime="2025-11-27 11:41:40.296168933 +0000 UTC m=+1941.395667171" Nov 27 11:41:40 crc kubenswrapper[4807]: I1127 11:41:40.532677 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:41:40 crc kubenswrapper[4807]: E1127 11:41:40.533376 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:41:52 crc kubenswrapper[4807]: I1127 11:41:52.533869 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:41:52 crc kubenswrapper[4807]: E1127 11:41:52.535434 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:42:07 crc kubenswrapper[4807]: I1127 11:42:07.532005 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:42:07 crc kubenswrapper[4807]: E1127 11:42:07.532777 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:42:21 crc kubenswrapper[4807]: I1127 11:42:21.532834 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:42:22 crc kubenswrapper[4807]: I1127 11:42:22.725776 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" 
event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"912ea98b94c351525e710a12386f0d7c3210cb64ee176bb082a9b11ed97b0455"} Nov 27 11:44:50 crc kubenswrapper[4807]: I1127 11:44:50.922039 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:44:50 crc kubenswrapper[4807]: I1127 11:44:50.922698 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.143213 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6"] Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.145205 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.154971 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.155050 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.156080 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6"] Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.171043 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42h7w\" (UniqueName: \"kubernetes.io/projected/e6c1f659-9759-4819-9856-cd307bfea2bf-kube-api-access-42h7w\") pod \"collect-profiles-29404065-n5sg6\" (UID: \"e6c1f659-9759-4819-9856-cd307bfea2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.171334 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6c1f659-9759-4819-9856-cd307bfea2bf-config-volume\") pod \"collect-profiles-29404065-n5sg6\" (UID: \"e6c1f659-9759-4819-9856-cd307bfea2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.171388 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6c1f659-9759-4819-9856-cd307bfea2bf-secret-volume\") pod \"collect-profiles-29404065-n5sg6\" (UID: \"e6c1f659-9759-4819-9856-cd307bfea2bf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.274049 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6c1f659-9759-4819-9856-cd307bfea2bf-config-volume\") pod \"collect-profiles-29404065-n5sg6\" (UID: \"e6c1f659-9759-4819-9856-cd307bfea2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.274100 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6c1f659-9759-4819-9856-cd307bfea2bf-secret-volume\") pod \"collect-profiles-29404065-n5sg6\" (UID: \"e6c1f659-9759-4819-9856-cd307bfea2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.274131 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42h7w\" (UniqueName: \"kubernetes.io/projected/e6c1f659-9759-4819-9856-cd307bfea2bf-kube-api-access-42h7w\") pod \"collect-profiles-29404065-n5sg6\" (UID: \"e6c1f659-9759-4819-9856-cd307bfea2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.274869 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6c1f659-9759-4819-9856-cd307bfea2bf-config-volume\") pod \"collect-profiles-29404065-n5sg6\" (UID: \"e6c1f659-9759-4819-9856-cd307bfea2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.279882 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e6c1f659-9759-4819-9856-cd307bfea2bf-secret-volume\") pod \"collect-profiles-29404065-n5sg6\" (UID: \"e6c1f659-9759-4819-9856-cd307bfea2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.289902 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42h7w\" (UniqueName: \"kubernetes.io/projected/e6c1f659-9759-4819-9856-cd307bfea2bf-kube-api-access-42h7w\") pod \"collect-profiles-29404065-n5sg6\" (UID: \"e6c1f659-9759-4819-9856-cd307bfea2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.471941 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" Nov 27 11:45:00 crc kubenswrapper[4807]: I1127 11:45:00.964234 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6"] Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.152351 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fhxw2"] Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.154873 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.161666 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fhxw2"] Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.190931 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279e9587-9efa-457e-aa2e-a7094cdfc488-catalog-content\") pod \"redhat-operators-fhxw2\" (UID: \"279e9587-9efa-457e-aa2e-a7094cdfc488\") " pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.191034 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279e9587-9efa-457e-aa2e-a7094cdfc488-utilities\") pod \"redhat-operators-fhxw2\" (UID: \"279e9587-9efa-457e-aa2e-a7094cdfc488\") " pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.191072 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xfwr\" (UniqueName: \"kubernetes.io/projected/279e9587-9efa-457e-aa2e-a7094cdfc488-kube-api-access-6xfwr\") pod \"redhat-operators-fhxw2\" (UID: \"279e9587-9efa-457e-aa2e-a7094cdfc488\") " pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.234831 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" event={"ID":"e6c1f659-9759-4819-9856-cd307bfea2bf","Type":"ContainerStarted","Data":"739ee50b485307d1cdb7e915b3561084ec434362be515676f2471c3e5c73c329"} Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.234877 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" event={"ID":"e6c1f659-9759-4819-9856-cd307bfea2bf","Type":"ContainerStarted","Data":"5a5a798a91a9d2e46f31024855c01720d6241e5fbf38f9bac6e0e804588b6315"} Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.258290 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" podStartSLOduration=1.258266927 podStartE2EDuration="1.258266927s" podCreationTimestamp="2025-11-27 11:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 11:45:01.248512748 +0000 UTC m=+2142.348010946" watchObservedRunningTime="2025-11-27 11:45:01.258266927 +0000 UTC m=+2142.357765135" Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.292557 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279e9587-9efa-457e-aa2e-a7094cdfc488-catalog-content\") pod \"redhat-operators-fhxw2\" (UID: \"279e9587-9efa-457e-aa2e-a7094cdfc488\") " pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.292714 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279e9587-9efa-457e-aa2e-a7094cdfc488-utilities\") pod \"redhat-operators-fhxw2\" (UID: \"279e9587-9efa-457e-aa2e-a7094cdfc488\") " pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.292760 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xfwr\" (UniqueName: \"kubernetes.io/projected/279e9587-9efa-457e-aa2e-a7094cdfc488-kube-api-access-6xfwr\") pod \"redhat-operators-fhxw2\" (UID: \"279e9587-9efa-457e-aa2e-a7094cdfc488\") " pod="openshift-marketplace/redhat-operators-fhxw2" 
Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.293587 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279e9587-9efa-457e-aa2e-a7094cdfc488-utilities\") pod \"redhat-operators-fhxw2\" (UID: \"279e9587-9efa-457e-aa2e-a7094cdfc488\") " pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.293757 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279e9587-9efa-457e-aa2e-a7094cdfc488-catalog-content\") pod \"redhat-operators-fhxw2\" (UID: \"279e9587-9efa-457e-aa2e-a7094cdfc488\") " pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.313110 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xfwr\" (UniqueName: \"kubernetes.io/projected/279e9587-9efa-457e-aa2e-a7094cdfc488-kube-api-access-6xfwr\") pod \"redhat-operators-fhxw2\" (UID: \"279e9587-9efa-457e-aa2e-a7094cdfc488\") " pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.471971 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:01 crc kubenswrapper[4807]: I1127 11:45:01.921690 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fhxw2"] Nov 27 11:45:01 crc kubenswrapper[4807]: W1127 11:45:01.924060 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod279e9587_9efa_457e_aa2e_a7094cdfc488.slice/crio-20771187bf535a46a5a9b221a21c48619ad650255b0e856a8c4bfd9ce85f3aec WatchSource:0}: Error finding container 20771187bf535a46a5a9b221a21c48619ad650255b0e856a8c4bfd9ce85f3aec: Status 404 returned error can't find the container with id 20771187bf535a46a5a9b221a21c48619ad650255b0e856a8c4bfd9ce85f3aec Nov 27 11:45:02 crc kubenswrapper[4807]: I1127 11:45:02.244367 4807 generic.go:334] "Generic (PLEG): container finished" podID="279e9587-9efa-457e-aa2e-a7094cdfc488" containerID="71ed12f89e6480358efc35c17d3fa6cccde46174654dcd32e0bf7c599b0a233c" exitCode=0 Nov 27 11:45:02 crc kubenswrapper[4807]: I1127 11:45:02.244433 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhxw2" event={"ID":"279e9587-9efa-457e-aa2e-a7094cdfc488","Type":"ContainerDied","Data":"71ed12f89e6480358efc35c17d3fa6cccde46174654dcd32e0bf7c599b0a233c"} Nov 27 11:45:02 crc kubenswrapper[4807]: I1127 11:45:02.244457 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhxw2" event={"ID":"279e9587-9efa-457e-aa2e-a7094cdfc488","Type":"ContainerStarted","Data":"20771187bf535a46a5a9b221a21c48619ad650255b0e856a8c4bfd9ce85f3aec"} Nov 27 11:45:02 crc kubenswrapper[4807]: I1127 11:45:02.246220 4807 generic.go:334] "Generic (PLEG): container finished" podID="e6c1f659-9759-4819-9856-cd307bfea2bf" containerID="739ee50b485307d1cdb7e915b3561084ec434362be515676f2471c3e5c73c329" exitCode=0 Nov 27 11:45:02 crc kubenswrapper[4807]: I1127 11:45:02.246273 
4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" event={"ID":"e6c1f659-9759-4819-9856-cd307bfea2bf","Type":"ContainerDied","Data":"739ee50b485307d1cdb7e915b3561084ec434362be515676f2471c3e5c73c329"} Nov 27 11:45:03 crc kubenswrapper[4807]: I1127 11:45:03.618578 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" Nov 27 11:45:03 crc kubenswrapper[4807]: I1127 11:45:03.736608 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6c1f659-9759-4819-9856-cd307bfea2bf-config-volume\") pod \"e6c1f659-9759-4819-9856-cd307bfea2bf\" (UID: \"e6c1f659-9759-4819-9856-cd307bfea2bf\") " Nov 27 11:45:03 crc kubenswrapper[4807]: I1127 11:45:03.736661 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42h7w\" (UniqueName: \"kubernetes.io/projected/e6c1f659-9759-4819-9856-cd307bfea2bf-kube-api-access-42h7w\") pod \"e6c1f659-9759-4819-9856-cd307bfea2bf\" (UID: \"e6c1f659-9759-4819-9856-cd307bfea2bf\") " Nov 27 11:45:03 crc kubenswrapper[4807]: I1127 11:45:03.736767 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6c1f659-9759-4819-9856-cd307bfea2bf-secret-volume\") pod \"e6c1f659-9759-4819-9856-cd307bfea2bf\" (UID: \"e6c1f659-9759-4819-9856-cd307bfea2bf\") " Nov 27 11:45:03 crc kubenswrapper[4807]: I1127 11:45:03.737701 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6c1f659-9759-4819-9856-cd307bfea2bf-config-volume" (OuterVolumeSpecName: "config-volume") pod "e6c1f659-9759-4819-9856-cd307bfea2bf" (UID: "e6c1f659-9759-4819-9856-cd307bfea2bf"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:45:03 crc kubenswrapper[4807]: I1127 11:45:03.738365 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6c1f659-9759-4819-9856-cd307bfea2bf-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:03 crc kubenswrapper[4807]: I1127 11:45:03.743000 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6c1f659-9759-4819-9856-cd307bfea2bf-kube-api-access-42h7w" (OuterVolumeSpecName: "kube-api-access-42h7w") pod "e6c1f659-9759-4819-9856-cd307bfea2bf" (UID: "e6c1f659-9759-4819-9856-cd307bfea2bf"). InnerVolumeSpecName "kube-api-access-42h7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:45:03 crc kubenswrapper[4807]: I1127 11:45:03.743220 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6c1f659-9759-4819-9856-cd307bfea2bf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e6c1f659-9759-4819-9856-cd307bfea2bf" (UID: "e6c1f659-9759-4819-9856-cd307bfea2bf"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:45:03 crc kubenswrapper[4807]: I1127 11:45:03.844919 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6c1f659-9759-4819-9856-cd307bfea2bf-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:03 crc kubenswrapper[4807]: I1127 11:45:03.844959 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42h7w\" (UniqueName: \"kubernetes.io/projected/e6c1f659-9759-4819-9856-cd307bfea2bf-kube-api-access-42h7w\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:04 crc kubenswrapper[4807]: I1127 11:45:04.264535 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" event={"ID":"e6c1f659-9759-4819-9856-cd307bfea2bf","Type":"ContainerDied","Data":"5a5a798a91a9d2e46f31024855c01720d6241e5fbf38f9bac6e0e804588b6315"} Nov 27 11:45:04 crc kubenswrapper[4807]: I1127 11:45:04.264573 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a5a798a91a9d2e46f31024855c01720d6241e5fbf38f9bac6e0e804588b6315" Nov 27 11:45:04 crc kubenswrapper[4807]: I1127 11:45:04.264620 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404065-n5sg6" Nov 27 11:45:04 crc kubenswrapper[4807]: I1127 11:45:04.317438 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n"] Nov 27 11:45:04 crc kubenswrapper[4807]: I1127 11:45:04.325310 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404020-wx78n"] Nov 27 11:45:05 crc kubenswrapper[4807]: I1127 11:45:05.273537 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhxw2" event={"ID":"279e9587-9efa-457e-aa2e-a7094cdfc488","Type":"ContainerStarted","Data":"1d6bb9fa121f4df01811950ad0ddfb78b73144fff846b3b9a489cc94c2515f08"} Nov 27 11:45:05 crc kubenswrapper[4807]: I1127 11:45:05.545469 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acbc9004-f19e-419d-b609-2f9dda223b0d" path="/var/lib/kubelet/pods/acbc9004-f19e-419d-b609-2f9dda223b0d/volumes" Nov 27 11:45:07 crc kubenswrapper[4807]: I1127 11:45:07.322223 4807 generic.go:334] "Generic (PLEG): container finished" podID="279e9587-9efa-457e-aa2e-a7094cdfc488" containerID="1d6bb9fa121f4df01811950ad0ddfb78b73144fff846b3b9a489cc94c2515f08" exitCode=0 Nov 27 11:45:07 crc kubenswrapper[4807]: I1127 11:45:07.322456 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhxw2" event={"ID":"279e9587-9efa-457e-aa2e-a7094cdfc488","Type":"ContainerDied","Data":"1d6bb9fa121f4df01811950ad0ddfb78b73144fff846b3b9a489cc94c2515f08"} Nov 27 11:45:08 crc kubenswrapper[4807]: I1127 11:45:08.334432 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhxw2" event={"ID":"279e9587-9efa-457e-aa2e-a7094cdfc488","Type":"ContainerStarted","Data":"129d392a4ccc7ceea7e9a081a298068086f3f0334fbbe4e0d5e367446022afa2"} Nov 27 11:45:08 crc kubenswrapper[4807]: 
I1127 11:45:08.359429 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fhxw2" podStartSLOduration=1.778421915 podStartE2EDuration="7.359414669s" podCreationTimestamp="2025-11-27 11:45:01 +0000 UTC" firstStartedPulling="2025-11-27 11:45:02.245914485 +0000 UTC m=+2143.345412683" lastFinishedPulling="2025-11-27 11:45:07.826907239 +0000 UTC m=+2148.926405437" observedRunningTime="2025-11-27 11:45:08.358405162 +0000 UTC m=+2149.457903350" watchObservedRunningTime="2025-11-27 11:45:08.359414669 +0000 UTC m=+2149.458912867" Nov 27 11:45:11 crc kubenswrapper[4807]: I1127 11:45:11.472389 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:11 crc kubenswrapper[4807]: I1127 11:45:11.474900 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:12 crc kubenswrapper[4807]: I1127 11:45:12.538280 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fhxw2" podUID="279e9587-9efa-457e-aa2e-a7094cdfc488" containerName="registry-server" probeResult="failure" output=< Nov 27 11:45:12 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Nov 27 11:45:12 crc kubenswrapper[4807]: > Nov 27 11:45:20 crc kubenswrapper[4807]: I1127 11:45:20.921626 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:45:20 crc kubenswrapper[4807]: I1127 11:45:20.922235 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:45:21 crc kubenswrapper[4807]: I1127 11:45:21.545070 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:21 crc kubenswrapper[4807]: I1127 11:45:21.603886 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:21 crc kubenswrapper[4807]: I1127 11:45:21.784291 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fhxw2"] Nov 27 11:45:23 crc kubenswrapper[4807]: I1127 11:45:23.490235 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fhxw2" podUID="279e9587-9efa-457e-aa2e-a7094cdfc488" containerName="registry-server" containerID="cri-o://129d392a4ccc7ceea7e9a081a298068086f3f0334fbbe4e0d5e367446022afa2" gracePeriod=2 Nov 27 11:45:23 crc kubenswrapper[4807]: I1127 11:45:23.960534 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.137093 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xfwr\" (UniqueName: \"kubernetes.io/projected/279e9587-9efa-457e-aa2e-a7094cdfc488-kube-api-access-6xfwr\") pod \"279e9587-9efa-457e-aa2e-a7094cdfc488\" (UID: \"279e9587-9efa-457e-aa2e-a7094cdfc488\") " Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.137346 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279e9587-9efa-457e-aa2e-a7094cdfc488-catalog-content\") pod \"279e9587-9efa-457e-aa2e-a7094cdfc488\" (UID: \"279e9587-9efa-457e-aa2e-a7094cdfc488\") " Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.137562 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279e9587-9efa-457e-aa2e-a7094cdfc488-utilities\") pod \"279e9587-9efa-457e-aa2e-a7094cdfc488\" (UID: \"279e9587-9efa-457e-aa2e-a7094cdfc488\") " Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.138987 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/279e9587-9efa-457e-aa2e-a7094cdfc488-utilities" (OuterVolumeSpecName: "utilities") pod "279e9587-9efa-457e-aa2e-a7094cdfc488" (UID: "279e9587-9efa-457e-aa2e-a7094cdfc488"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.147074 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/279e9587-9efa-457e-aa2e-a7094cdfc488-kube-api-access-6xfwr" (OuterVolumeSpecName: "kube-api-access-6xfwr") pod "279e9587-9efa-457e-aa2e-a7094cdfc488" (UID: "279e9587-9efa-457e-aa2e-a7094cdfc488"). InnerVolumeSpecName "kube-api-access-6xfwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.241000 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279e9587-9efa-457e-aa2e-a7094cdfc488-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.241049 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xfwr\" (UniqueName: \"kubernetes.io/projected/279e9587-9efa-457e-aa2e-a7094cdfc488-kube-api-access-6xfwr\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.241315 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/279e9587-9efa-457e-aa2e-a7094cdfc488-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "279e9587-9efa-457e-aa2e-a7094cdfc488" (UID: "279e9587-9efa-457e-aa2e-a7094cdfc488"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.343523 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279e9587-9efa-457e-aa2e-a7094cdfc488-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.506626 4807 generic.go:334] "Generic (PLEG): container finished" podID="279e9587-9efa-457e-aa2e-a7094cdfc488" containerID="129d392a4ccc7ceea7e9a081a298068086f3f0334fbbe4e0d5e367446022afa2" exitCode=0 Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.506679 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhxw2" event={"ID":"279e9587-9efa-457e-aa2e-a7094cdfc488","Type":"ContainerDied","Data":"129d392a4ccc7ceea7e9a081a298068086f3f0334fbbe4e0d5e367446022afa2"} Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.506707 4807 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fhxw2" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.506740 4807 scope.go:117] "RemoveContainer" containerID="129d392a4ccc7ceea7e9a081a298068086f3f0334fbbe4e0d5e367446022afa2" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.506725 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fhxw2" event={"ID":"279e9587-9efa-457e-aa2e-a7094cdfc488","Type":"ContainerDied","Data":"20771187bf535a46a5a9b221a21c48619ad650255b0e856a8c4bfd9ce85f3aec"} Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.548588 4807 scope.go:117] "RemoveContainer" containerID="1d6bb9fa121f4df01811950ad0ddfb78b73144fff846b3b9a489cc94c2515f08" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.562200 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fhxw2"] Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.570610 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fhxw2"] Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.592863 4807 scope.go:117] "RemoveContainer" containerID="71ed12f89e6480358efc35c17d3fa6cccde46174654dcd32e0bf7c599b0a233c" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.652103 4807 scope.go:117] "RemoveContainer" containerID="129d392a4ccc7ceea7e9a081a298068086f3f0334fbbe4e0d5e367446022afa2" Nov 27 11:45:24 crc kubenswrapper[4807]: E1127 11:45:24.652915 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"129d392a4ccc7ceea7e9a081a298068086f3f0334fbbe4e0d5e367446022afa2\": container with ID starting with 129d392a4ccc7ceea7e9a081a298068086f3f0334fbbe4e0d5e367446022afa2 not found: ID does not exist" containerID="129d392a4ccc7ceea7e9a081a298068086f3f0334fbbe4e0d5e367446022afa2" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.652943 4807 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"129d392a4ccc7ceea7e9a081a298068086f3f0334fbbe4e0d5e367446022afa2"} err="failed to get container status \"129d392a4ccc7ceea7e9a081a298068086f3f0334fbbe4e0d5e367446022afa2\": rpc error: code = NotFound desc = could not find container \"129d392a4ccc7ceea7e9a081a298068086f3f0334fbbe4e0d5e367446022afa2\": container with ID starting with 129d392a4ccc7ceea7e9a081a298068086f3f0334fbbe4e0d5e367446022afa2 not found: ID does not exist" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.652963 4807 scope.go:117] "RemoveContainer" containerID="1d6bb9fa121f4df01811950ad0ddfb78b73144fff846b3b9a489cc94c2515f08" Nov 27 11:45:24 crc kubenswrapper[4807]: E1127 11:45:24.653330 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d6bb9fa121f4df01811950ad0ddfb78b73144fff846b3b9a489cc94c2515f08\": container with ID starting with 1d6bb9fa121f4df01811950ad0ddfb78b73144fff846b3b9a489cc94c2515f08 not found: ID does not exist" containerID="1d6bb9fa121f4df01811950ad0ddfb78b73144fff846b3b9a489cc94c2515f08" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.653381 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d6bb9fa121f4df01811950ad0ddfb78b73144fff846b3b9a489cc94c2515f08"} err="failed to get container status \"1d6bb9fa121f4df01811950ad0ddfb78b73144fff846b3b9a489cc94c2515f08\": rpc error: code = NotFound desc = could not find container \"1d6bb9fa121f4df01811950ad0ddfb78b73144fff846b3b9a489cc94c2515f08\": container with ID starting with 1d6bb9fa121f4df01811950ad0ddfb78b73144fff846b3b9a489cc94c2515f08 not found: ID does not exist" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.653413 4807 scope.go:117] "RemoveContainer" containerID="71ed12f89e6480358efc35c17d3fa6cccde46174654dcd32e0bf7c599b0a233c" Nov 27 11:45:24 crc kubenswrapper[4807]: E1127 
11:45:24.653950 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71ed12f89e6480358efc35c17d3fa6cccde46174654dcd32e0bf7c599b0a233c\": container with ID starting with 71ed12f89e6480358efc35c17d3fa6cccde46174654dcd32e0bf7c599b0a233c not found: ID does not exist" containerID="71ed12f89e6480358efc35c17d3fa6cccde46174654dcd32e0bf7c599b0a233c" Nov 27 11:45:24 crc kubenswrapper[4807]: I1127 11:45:24.653998 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71ed12f89e6480358efc35c17d3fa6cccde46174654dcd32e0bf7c599b0a233c"} err="failed to get container status \"71ed12f89e6480358efc35c17d3fa6cccde46174654dcd32e0bf7c599b0a233c\": rpc error: code = NotFound desc = could not find container \"71ed12f89e6480358efc35c17d3fa6cccde46174654dcd32e0bf7c599b0a233c\": container with ID starting with 71ed12f89e6480358efc35c17d3fa6cccde46174654dcd32e0bf7c599b0a233c not found: ID does not exist" Nov 27 11:45:25 crc kubenswrapper[4807]: I1127 11:45:25.541744 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="279e9587-9efa-457e-aa2e-a7094cdfc488" path="/var/lib/kubelet/pods/279e9587-9efa-457e-aa2e-a7094cdfc488/volumes" Nov 27 11:45:25 crc kubenswrapper[4807]: I1127 11:45:25.759381 4807 scope.go:117] "RemoveContainer" containerID="070b8fbc8145631b85d09fdc49b50fcf20feaa04047883449ef385dc134ab75e" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.669401 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gb5j4"] Nov 27 11:45:32 crc kubenswrapper[4807]: E1127 11:45:32.670468 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279e9587-9efa-457e-aa2e-a7094cdfc488" containerName="extract-content" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.670484 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="279e9587-9efa-457e-aa2e-a7094cdfc488" 
containerName="extract-content" Nov 27 11:45:32 crc kubenswrapper[4807]: E1127 11:45:32.670508 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6c1f659-9759-4819-9856-cd307bfea2bf" containerName="collect-profiles" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.670516 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6c1f659-9759-4819-9856-cd307bfea2bf" containerName="collect-profiles" Nov 27 11:45:32 crc kubenswrapper[4807]: E1127 11:45:32.670547 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279e9587-9efa-457e-aa2e-a7094cdfc488" containerName="extract-utilities" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.670556 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="279e9587-9efa-457e-aa2e-a7094cdfc488" containerName="extract-utilities" Nov 27 11:45:32 crc kubenswrapper[4807]: E1127 11:45:32.670567 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279e9587-9efa-457e-aa2e-a7094cdfc488" containerName="registry-server" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.670574 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="279e9587-9efa-457e-aa2e-a7094cdfc488" containerName="registry-server" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.670827 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6c1f659-9759-4819-9856-cd307bfea2bf" containerName="collect-profiles" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.670849 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="279e9587-9efa-457e-aa2e-a7094cdfc488" containerName="registry-server" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.672535 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.681835 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gb5j4"] Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.866644 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzm8h\" (UniqueName: \"kubernetes.io/projected/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-kube-api-access-vzm8h\") pod \"community-operators-gb5j4\" (UID: \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\") " pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.866688 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-catalog-content\") pod \"community-operators-gb5j4\" (UID: \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\") " pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.867479 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-utilities\") pod \"community-operators-gb5j4\" (UID: \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\") " pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.969131 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzm8h\" (UniqueName: \"kubernetes.io/projected/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-kube-api-access-vzm8h\") pod \"community-operators-gb5j4\" (UID: \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\") " pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.969183 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-catalog-content\") pod \"community-operators-gb5j4\" (UID: \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\") " pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.969337 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-utilities\") pod \"community-operators-gb5j4\" (UID: \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\") " pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.970045 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-utilities\") pod \"community-operators-gb5j4\" (UID: \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\") " pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:32 crc kubenswrapper[4807]: I1127 11:45:32.970294 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-catalog-content\") pod \"community-operators-gb5j4\" (UID: \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\") " pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:33 crc kubenswrapper[4807]: I1127 11:45:33.003401 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzm8h\" (UniqueName: \"kubernetes.io/projected/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-kube-api-access-vzm8h\") pod \"community-operators-gb5j4\" (UID: \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\") " pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:33 crc kubenswrapper[4807]: I1127 11:45:33.303155 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:33 crc kubenswrapper[4807]: I1127 11:45:33.897436 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gb5j4"] Nov 27 11:45:34 crc kubenswrapper[4807]: I1127 11:45:34.601466 4807 generic.go:334] "Generic (PLEG): container finished" podID="ac809f4a-2962-4d2e-acc5-ee39a20bb86e" containerID="f473cce17908043998e19b86dd974b515962e3470f1dc9143646952824f1a627" exitCode=0 Nov 27 11:45:34 crc kubenswrapper[4807]: I1127 11:45:34.601553 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gb5j4" event={"ID":"ac809f4a-2962-4d2e-acc5-ee39a20bb86e","Type":"ContainerDied","Data":"f473cce17908043998e19b86dd974b515962e3470f1dc9143646952824f1a627"} Nov 27 11:45:34 crc kubenswrapper[4807]: I1127 11:45:34.601819 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gb5j4" event={"ID":"ac809f4a-2962-4d2e-acc5-ee39a20bb86e","Type":"ContainerStarted","Data":"e1638567ef8c329642161b35c9bd350c5c241d4189ff33af98e965354a8ecbff"} Nov 27 11:45:36 crc kubenswrapper[4807]: I1127 11:45:36.632343 4807 generic.go:334] "Generic (PLEG): container finished" podID="ac809f4a-2962-4d2e-acc5-ee39a20bb86e" containerID="b72d719e1075713c445cefddf3c83b25d38bec8a516b454278088c6398bbe827" exitCode=0 Nov 27 11:45:36 crc kubenswrapper[4807]: I1127 11:45:36.632375 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gb5j4" event={"ID":"ac809f4a-2962-4d2e-acc5-ee39a20bb86e","Type":"ContainerDied","Data":"b72d719e1075713c445cefddf3c83b25d38bec8a516b454278088c6398bbe827"} Nov 27 11:45:37 crc kubenswrapper[4807]: I1127 11:45:37.652212 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gb5j4" 
event={"ID":"ac809f4a-2962-4d2e-acc5-ee39a20bb86e","Type":"ContainerStarted","Data":"3ef717fac549d81d598adc08b334495813d81d6ad2051dcd6cedadbb6dc6dee1"} Nov 27 11:45:37 crc kubenswrapper[4807]: I1127 11:45:37.675115 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gb5j4" podStartSLOduration=3.079295281 podStartE2EDuration="5.67509687s" podCreationTimestamp="2025-11-27 11:45:32 +0000 UTC" firstStartedPulling="2025-11-27 11:45:34.602798056 +0000 UTC m=+2175.702296254" lastFinishedPulling="2025-11-27 11:45:37.198599645 +0000 UTC m=+2178.298097843" observedRunningTime="2025-11-27 11:45:37.670678753 +0000 UTC m=+2178.770176971" watchObservedRunningTime="2025-11-27 11:45:37.67509687 +0000 UTC m=+2178.774595058" Nov 27 11:45:41 crc kubenswrapper[4807]: I1127 11:45:41.689203 4807 generic.go:334] "Generic (PLEG): container finished" podID="36b0f83c-c6d3-4d4b-9675-478b3f02f952" containerID="c9ada6ef2802d0ec7ecc5217fa1c4dc717cb8609d23ee9bb4af67f91544e2deb" exitCode=0 Nov 27 11:45:41 crc kubenswrapper[4807]: I1127 11:45:41.689293 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" event={"ID":"36b0f83c-c6d3-4d4b-9675-478b3f02f952","Type":"ContainerDied","Data":"c9ada6ef2802d0ec7ecc5217fa1c4dc717cb8609d23ee9bb4af67f91544e2deb"} Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.111387 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.193266 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-libvirt-secret-0\") pod \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.193313 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-libvirt-combined-ca-bundle\") pod \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.193407 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktq9s\" (UniqueName: \"kubernetes.io/projected/36b0f83c-c6d3-4d4b-9675-478b3f02f952-kube-api-access-ktq9s\") pod \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.193439 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-inventory\") pod \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.193603 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-ssh-key\") pod \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\" (UID: \"36b0f83c-c6d3-4d4b-9675-478b3f02f952\") " Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.198819 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "36b0f83c-c6d3-4d4b-9675-478b3f02f952" (UID: "36b0f83c-c6d3-4d4b-9675-478b3f02f952"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.199024 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36b0f83c-c6d3-4d4b-9675-478b3f02f952-kube-api-access-ktq9s" (OuterVolumeSpecName: "kube-api-access-ktq9s") pod "36b0f83c-c6d3-4d4b-9675-478b3f02f952" (UID: "36b0f83c-c6d3-4d4b-9675-478b3f02f952"). InnerVolumeSpecName "kube-api-access-ktq9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.219594 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "36b0f83c-c6d3-4d4b-9675-478b3f02f952" (UID: "36b0f83c-c6d3-4d4b-9675-478b3f02f952"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.221670 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "36b0f83c-c6d3-4d4b-9675-478b3f02f952" (UID: "36b0f83c-c6d3-4d4b-9675-478b3f02f952"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.241816 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-inventory" (OuterVolumeSpecName: "inventory") pod "36b0f83c-c6d3-4d4b-9675-478b3f02f952" (UID: "36b0f83c-c6d3-4d4b-9675-478b3f02f952"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.296658 4807 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.296712 4807 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.296727 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktq9s\" (UniqueName: \"kubernetes.io/projected/36b0f83c-c6d3-4d4b-9675-478b3f02f952-kube-api-access-ktq9s\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.296739 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.296752 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36b0f83c-c6d3-4d4b-9675-478b3f02f952-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.303864 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.303924 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.351095 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.705754 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" event={"ID":"36b0f83c-c6d3-4d4b-9675-478b3f02f952","Type":"ContainerDied","Data":"3bf7ed0eca421cfd1e3abe1349b76d0f6294ac10b3a5a27d122ae83fa623ff4d"} Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.705803 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bf7ed0eca421cfd1e3abe1349b76d0f6294ac10b3a5a27d122ae83fa623ff4d" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.705805 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.757366 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.803003 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz"] Nov 27 11:45:43 crc kubenswrapper[4807]: E1127 11:45:43.803479 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36b0f83c-c6d3-4d4b-9675-478b3f02f952" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.803502 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b0f83c-c6d3-4d4b-9675-478b3f02f952" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.803732 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="36b0f83c-c6d3-4d4b-9675-478b3f02f952" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.804494 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.809048 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.809075 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.809718 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.809877 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.810148 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.810170 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.810214 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.823763 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gb5j4"] Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.834121 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz"] Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.907536 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-cell1-compute-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.907618 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.907651 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.907675 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.907693 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:43 crc 
kubenswrapper[4807]: I1127 11:45:43.907747 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.907772 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.907821 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:43 crc kubenswrapper[4807]: I1127 11:45:43.907841 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62kl5\" (UniqueName: \"kubernetes.io/projected/08c2cd76-cfdb-4de6-ac04-8925b75415fa-kube-api-access-62kl5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.008684 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.008732 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62kl5\" (UniqueName: \"kubernetes.io/projected/08c2cd76-cfdb-4de6-ac04-8925b75415fa-kube-api-access-62kl5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.008776 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.008808 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.008839 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 
27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.008857 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.008878 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.008929 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.008956 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.011649 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-extra-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.014407 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.014755 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.014916 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.014986 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.015211 4807 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.021805 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.021899 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.026626 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62kl5\" (UniqueName: \"kubernetes.io/projected/08c2cd76-cfdb-4de6-ac04-8925b75415fa-kube-api-access-62kl5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-xnswz\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.118689 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.632801 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz"] Nov 27 11:45:44 crc kubenswrapper[4807]: I1127 11:45:44.714425 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" event={"ID":"08c2cd76-cfdb-4de6-ac04-8925b75415fa","Type":"ContainerStarted","Data":"e31c221a01dcfa34baceb661f098a23099b3673b6abd13a801fd59b355fe10bd"} Nov 27 11:45:45 crc kubenswrapper[4807]: I1127 11:45:45.723811 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" event={"ID":"08c2cd76-cfdb-4de6-ac04-8925b75415fa","Type":"ContainerStarted","Data":"e992476a320c33b35c44c05376dfc3949a07906a7074ab32f84c91c4060238e6"} Nov 27 11:45:45 crc kubenswrapper[4807]: I1127 11:45:45.723940 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gb5j4" podUID="ac809f4a-2962-4d2e-acc5-ee39a20bb86e" containerName="registry-server" containerID="cri-o://3ef717fac549d81d598adc08b334495813d81d6ad2051dcd6cedadbb6dc6dee1" gracePeriod=2 Nov 27 11:45:45 crc kubenswrapper[4807]: I1127 11:45:45.765149 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" podStartSLOduration=2.291186966 podStartE2EDuration="2.765127092s" podCreationTimestamp="2025-11-27 11:45:43 +0000 UTC" firstStartedPulling="2025-11-27 11:45:44.63954044 +0000 UTC m=+2185.739038638" lastFinishedPulling="2025-11-27 11:45:45.113480566 +0000 UTC m=+2186.212978764" observedRunningTime="2025-11-27 11:45:45.743003396 +0000 UTC m=+2186.842501594" watchObservedRunningTime="2025-11-27 11:45:45.765127092 +0000 UTC m=+2186.864625280" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 
11:45:46.174407 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.249030 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-catalog-content\") pod \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\" (UID: \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\") " Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.249173 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-utilities\") pod \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\" (UID: \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\") " Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.249198 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzm8h\" (UniqueName: \"kubernetes.io/projected/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-kube-api-access-vzm8h\") pod \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\" (UID: \"ac809f4a-2962-4d2e-acc5-ee39a20bb86e\") " Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.250124 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-utilities" (OuterVolumeSpecName: "utilities") pod "ac809f4a-2962-4d2e-acc5-ee39a20bb86e" (UID: "ac809f4a-2962-4d2e-acc5-ee39a20bb86e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.255066 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-kube-api-access-vzm8h" (OuterVolumeSpecName: "kube-api-access-vzm8h") pod "ac809f4a-2962-4d2e-acc5-ee39a20bb86e" (UID: "ac809f4a-2962-4d2e-acc5-ee39a20bb86e"). InnerVolumeSpecName "kube-api-access-vzm8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.298991 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac809f4a-2962-4d2e-acc5-ee39a20bb86e" (UID: "ac809f4a-2962-4d2e-acc5-ee39a20bb86e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.350592 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.350687 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.350697 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzm8h\" (UniqueName: \"kubernetes.io/projected/ac809f4a-2962-4d2e-acc5-ee39a20bb86e-kube-api-access-vzm8h\") on node \"crc\" DevicePath \"\"" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.734670 4807 generic.go:334] "Generic (PLEG): container finished" podID="ac809f4a-2962-4d2e-acc5-ee39a20bb86e" 
containerID="3ef717fac549d81d598adc08b334495813d81d6ad2051dcd6cedadbb6dc6dee1" exitCode=0 Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.735336 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gb5j4" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.735333 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gb5j4" event={"ID":"ac809f4a-2962-4d2e-acc5-ee39a20bb86e","Type":"ContainerDied","Data":"3ef717fac549d81d598adc08b334495813d81d6ad2051dcd6cedadbb6dc6dee1"} Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.735467 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gb5j4" event={"ID":"ac809f4a-2962-4d2e-acc5-ee39a20bb86e","Type":"ContainerDied","Data":"e1638567ef8c329642161b35c9bd350c5c241d4189ff33af98e965354a8ecbff"} Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.735489 4807 scope.go:117] "RemoveContainer" containerID="3ef717fac549d81d598adc08b334495813d81d6ad2051dcd6cedadbb6dc6dee1" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.756327 4807 scope.go:117] "RemoveContainer" containerID="b72d719e1075713c445cefddf3c83b25d38bec8a516b454278088c6398bbe827" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.781863 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gb5j4"] Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.796648 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gb5j4"] Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.800225 4807 scope.go:117] "RemoveContainer" containerID="f473cce17908043998e19b86dd974b515962e3470f1dc9143646952824f1a627" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.834979 4807 scope.go:117] "RemoveContainer" containerID="3ef717fac549d81d598adc08b334495813d81d6ad2051dcd6cedadbb6dc6dee1" Nov 27 
11:45:46 crc kubenswrapper[4807]: E1127 11:45:46.835741 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ef717fac549d81d598adc08b334495813d81d6ad2051dcd6cedadbb6dc6dee1\": container with ID starting with 3ef717fac549d81d598adc08b334495813d81d6ad2051dcd6cedadbb6dc6dee1 not found: ID does not exist" containerID="3ef717fac549d81d598adc08b334495813d81d6ad2051dcd6cedadbb6dc6dee1" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.835775 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ef717fac549d81d598adc08b334495813d81d6ad2051dcd6cedadbb6dc6dee1"} err="failed to get container status \"3ef717fac549d81d598adc08b334495813d81d6ad2051dcd6cedadbb6dc6dee1\": rpc error: code = NotFound desc = could not find container \"3ef717fac549d81d598adc08b334495813d81d6ad2051dcd6cedadbb6dc6dee1\": container with ID starting with 3ef717fac549d81d598adc08b334495813d81d6ad2051dcd6cedadbb6dc6dee1 not found: ID does not exist" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.835812 4807 scope.go:117] "RemoveContainer" containerID="b72d719e1075713c445cefddf3c83b25d38bec8a516b454278088c6398bbe827" Nov 27 11:45:46 crc kubenswrapper[4807]: E1127 11:45:46.836516 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b72d719e1075713c445cefddf3c83b25d38bec8a516b454278088c6398bbe827\": container with ID starting with b72d719e1075713c445cefddf3c83b25d38bec8a516b454278088c6398bbe827 not found: ID does not exist" containerID="b72d719e1075713c445cefddf3c83b25d38bec8a516b454278088c6398bbe827" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.836540 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72d719e1075713c445cefddf3c83b25d38bec8a516b454278088c6398bbe827"} err="failed to get container status 
\"b72d719e1075713c445cefddf3c83b25d38bec8a516b454278088c6398bbe827\": rpc error: code = NotFound desc = could not find container \"b72d719e1075713c445cefddf3c83b25d38bec8a516b454278088c6398bbe827\": container with ID starting with b72d719e1075713c445cefddf3c83b25d38bec8a516b454278088c6398bbe827 not found: ID does not exist" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.836553 4807 scope.go:117] "RemoveContainer" containerID="f473cce17908043998e19b86dd974b515962e3470f1dc9143646952824f1a627" Nov 27 11:45:46 crc kubenswrapper[4807]: E1127 11:45:46.836815 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f473cce17908043998e19b86dd974b515962e3470f1dc9143646952824f1a627\": container with ID starting with f473cce17908043998e19b86dd974b515962e3470f1dc9143646952824f1a627 not found: ID does not exist" containerID="f473cce17908043998e19b86dd974b515962e3470f1dc9143646952824f1a627" Nov 27 11:45:46 crc kubenswrapper[4807]: I1127 11:45:46.836837 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f473cce17908043998e19b86dd974b515962e3470f1dc9143646952824f1a627"} err="failed to get container status \"f473cce17908043998e19b86dd974b515962e3470f1dc9143646952824f1a627\": rpc error: code = NotFound desc = could not find container \"f473cce17908043998e19b86dd974b515962e3470f1dc9143646952824f1a627\": container with ID starting with f473cce17908043998e19b86dd974b515962e3470f1dc9143646952824f1a627 not found: ID does not exist" Nov 27 11:45:47 crc kubenswrapper[4807]: I1127 11:45:47.549329 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac809f4a-2962-4d2e-acc5-ee39a20bb86e" path="/var/lib/kubelet/pods/ac809f4a-2962-4d2e-acc5-ee39a20bb86e/volumes" Nov 27 11:45:50 crc kubenswrapper[4807]: I1127 11:45:50.922052 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:45:50 crc kubenswrapper[4807]: I1127 11:45:50.923450 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:45:50 crc kubenswrapper[4807]: I1127 11:45:50.923738 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:45:50 crc kubenswrapper[4807]: I1127 11:45:50.924512 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"912ea98b94c351525e710a12386f0d7c3210cb64ee176bb082a9b11ed97b0455"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 11:45:50 crc kubenswrapper[4807]: I1127 11:45:50.924579 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://912ea98b94c351525e710a12386f0d7c3210cb64ee176bb082a9b11ed97b0455" gracePeriod=600 Nov 27 11:45:51 crc kubenswrapper[4807]: I1127 11:45:51.779073 4807 generic.go:334] "Generic (PLEG): container finished" podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerID="912ea98b94c351525e710a12386f0d7c3210cb64ee176bb082a9b11ed97b0455" exitCode=0 Nov 27 11:45:51 crc kubenswrapper[4807]: I1127 11:45:51.779156 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"912ea98b94c351525e710a12386f0d7c3210cb64ee176bb082a9b11ed97b0455"} Nov 27 11:45:51 crc kubenswrapper[4807]: I1127 11:45:51.779480 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c"} Nov 27 11:45:51 crc kubenswrapper[4807]: I1127 11:45:51.779530 4807 scope.go:117] "RemoveContainer" containerID="dfd7d82a9d54402e8fb4ab20b796943311a83d97a94ade4917650a47cc2f5b88" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.722151 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9nm7l"] Nov 27 11:46:26 crc kubenswrapper[4807]: E1127 11:46:26.723161 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac809f4a-2962-4d2e-acc5-ee39a20bb86e" containerName="extract-utilities" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.723180 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac809f4a-2962-4d2e-acc5-ee39a20bb86e" containerName="extract-utilities" Nov 27 11:46:26 crc kubenswrapper[4807]: E1127 11:46:26.723216 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac809f4a-2962-4d2e-acc5-ee39a20bb86e" containerName="extract-content" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.723224 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac809f4a-2962-4d2e-acc5-ee39a20bb86e" containerName="extract-content" Nov 27 11:46:26 crc kubenswrapper[4807]: E1127 11:46:26.723235 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac809f4a-2962-4d2e-acc5-ee39a20bb86e" containerName="registry-server" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.723261 4807 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ac809f4a-2962-4d2e-acc5-ee39a20bb86e" containerName="registry-server" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.723558 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac809f4a-2962-4d2e-acc5-ee39a20bb86e" containerName="registry-server" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.725167 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.732850 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nm7l"] Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.884803 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvgpl\" (UniqueName: \"kubernetes.io/projected/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-kube-api-access-qvgpl\") pod \"redhat-marketplace-9nm7l\" (UID: \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\") " pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.884871 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-catalog-content\") pod \"redhat-marketplace-9nm7l\" (UID: \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\") " pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.885126 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-utilities\") pod \"redhat-marketplace-9nm7l\" (UID: \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\") " pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.986770 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-catalog-content\") pod \"redhat-marketplace-9nm7l\" (UID: \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\") " pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.987111 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-utilities\") pod \"redhat-marketplace-9nm7l\" (UID: \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\") " pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.987209 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvgpl\" (UniqueName: \"kubernetes.io/projected/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-kube-api-access-qvgpl\") pod \"redhat-marketplace-9nm7l\" (UID: \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\") " pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.987355 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-catalog-content\") pod \"redhat-marketplace-9nm7l\" (UID: \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\") " pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:26 crc kubenswrapper[4807]: I1127 11:46:26.987522 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-utilities\") pod \"redhat-marketplace-9nm7l\" (UID: \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\") " pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:27 crc kubenswrapper[4807]: I1127 11:46:27.018195 4807 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qvgpl\" (UniqueName: \"kubernetes.io/projected/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-kube-api-access-qvgpl\") pod \"redhat-marketplace-9nm7l\" (UID: \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\") " pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:27 crc kubenswrapper[4807]: I1127 11:46:27.057421 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:27 crc kubenswrapper[4807]: I1127 11:46:27.543188 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nm7l"] Nov 27 11:46:28 crc kubenswrapper[4807]: I1127 11:46:28.084139 4807 generic.go:334] "Generic (PLEG): container finished" podID="9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" containerID="b9dfd9e6c3fb3de0a01248574440b063483750a894e6fd2d084bb8c769e1835b" exitCode=0 Nov 27 11:46:28 crc kubenswrapper[4807]: I1127 11:46:28.084238 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nm7l" event={"ID":"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c","Type":"ContainerDied","Data":"b9dfd9e6c3fb3de0a01248574440b063483750a894e6fd2d084bb8c769e1835b"} Nov 27 11:46:28 crc kubenswrapper[4807]: I1127 11:46:28.084432 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nm7l" event={"ID":"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c","Type":"ContainerStarted","Data":"d1cdc8751ab15ad85a46bbae70c455a12eb4683a3914568c679f05f1e413352d"} Nov 27 11:46:28 crc kubenswrapper[4807]: I1127 11:46:28.086431 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 11:46:30 crc kubenswrapper[4807]: I1127 11:46:30.110781 4807 generic.go:334] "Generic (PLEG): container finished" podID="9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" containerID="333f9f658c041ed40ffe8b76e20a04088423ac592b963e22c88dfc19571672de" exitCode=0 Nov 27 11:46:30 crc 
kubenswrapper[4807]: I1127 11:46:30.110867 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nm7l" event={"ID":"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c","Type":"ContainerDied","Data":"333f9f658c041ed40ffe8b76e20a04088423ac592b963e22c88dfc19571672de"} Nov 27 11:46:31 crc kubenswrapper[4807]: I1127 11:46:31.123803 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nm7l" event={"ID":"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c","Type":"ContainerStarted","Data":"6a1ccdbc7d54d9b9cca3cfdce973da7d14d6e0d995d015cec19a9d643b500017"} Nov 27 11:46:31 crc kubenswrapper[4807]: I1127 11:46:31.164922 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9nm7l" podStartSLOduration=2.677114321 podStartE2EDuration="5.164900458s" podCreationTimestamp="2025-11-27 11:46:26 +0000 UTC" firstStartedPulling="2025-11-27 11:46:28.086212164 +0000 UTC m=+2229.185710372" lastFinishedPulling="2025-11-27 11:46:30.573998311 +0000 UTC m=+2231.673496509" observedRunningTime="2025-11-27 11:46:31.140399499 +0000 UTC m=+2232.239897697" watchObservedRunningTime="2025-11-27 11:46:31.164900458 +0000 UTC m=+2232.264398656" Nov 27 11:46:37 crc kubenswrapper[4807]: I1127 11:46:37.059062 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:37 crc kubenswrapper[4807]: I1127 11:46:37.059472 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:37 crc kubenswrapper[4807]: I1127 11:46:37.106731 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:37 crc kubenswrapper[4807]: I1127 11:46:37.242648 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:37 crc kubenswrapper[4807]: I1127 11:46:37.345440 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nm7l"] Nov 27 11:46:39 crc kubenswrapper[4807]: I1127 11:46:39.199862 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9nm7l" podUID="9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" containerName="registry-server" containerID="cri-o://6a1ccdbc7d54d9b9cca3cfdce973da7d14d6e0d995d015cec19a9d643b500017" gracePeriod=2 Nov 27 11:46:39 crc kubenswrapper[4807]: I1127 11:46:39.659885 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:39 crc kubenswrapper[4807]: I1127 11:46:39.753833 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-utilities\") pod \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\" (UID: \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\") " Nov 27 11:46:39 crc kubenswrapper[4807]: I1127 11:46:39.754187 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-catalog-content\") pod \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\" (UID: \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\") " Nov 27 11:46:39 crc kubenswrapper[4807]: I1127 11:46:39.754887 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-utilities" (OuterVolumeSpecName: "utilities") pod "9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" (UID: "9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:46:39 crc kubenswrapper[4807]: I1127 11:46:39.755633 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvgpl\" (UniqueName: \"kubernetes.io/projected/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-kube-api-access-qvgpl\") pod \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\" (UID: \"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c\") " Nov 27 11:46:39 crc kubenswrapper[4807]: I1127 11:46:39.756308 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:46:39 crc kubenswrapper[4807]: I1127 11:46:39.770874 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" (UID: "9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:46:39 crc kubenswrapper[4807]: I1127 11:46:39.776677 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-kube-api-access-qvgpl" (OuterVolumeSpecName: "kube-api-access-qvgpl") pod "9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" (UID: "9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c"). InnerVolumeSpecName "kube-api-access-qvgpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:46:39 crc kubenswrapper[4807]: I1127 11:46:39.858047 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:46:39 crc kubenswrapper[4807]: I1127 11:46:39.858075 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvgpl\" (UniqueName: \"kubernetes.io/projected/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c-kube-api-access-qvgpl\") on node \"crc\" DevicePath \"\"" Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.213844 4807 generic.go:334] "Generic (PLEG): container finished" podID="9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" containerID="6a1ccdbc7d54d9b9cca3cfdce973da7d14d6e0d995d015cec19a9d643b500017" exitCode=0 Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.213911 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nm7l" event={"ID":"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c","Type":"ContainerDied","Data":"6a1ccdbc7d54d9b9cca3cfdce973da7d14d6e0d995d015cec19a9d643b500017"} Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.214005 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nm7l" event={"ID":"9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c","Type":"ContainerDied","Data":"d1cdc8751ab15ad85a46bbae70c455a12eb4683a3914568c679f05f1e413352d"} Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.213939 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nm7l" Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.214037 4807 scope.go:117] "RemoveContainer" containerID="6a1ccdbc7d54d9b9cca3cfdce973da7d14d6e0d995d015cec19a9d643b500017" Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.239671 4807 scope.go:117] "RemoveContainer" containerID="333f9f658c041ed40ffe8b76e20a04088423ac592b963e22c88dfc19571672de" Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.266328 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nm7l"] Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.270230 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nm7l"] Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.284274 4807 scope.go:117] "RemoveContainer" containerID="b9dfd9e6c3fb3de0a01248574440b063483750a894e6fd2d084bb8c769e1835b" Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.339166 4807 scope.go:117] "RemoveContainer" containerID="6a1ccdbc7d54d9b9cca3cfdce973da7d14d6e0d995d015cec19a9d643b500017" Nov 27 11:46:40 crc kubenswrapper[4807]: E1127 11:46:40.339655 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1ccdbc7d54d9b9cca3cfdce973da7d14d6e0d995d015cec19a9d643b500017\": container with ID starting with 6a1ccdbc7d54d9b9cca3cfdce973da7d14d6e0d995d015cec19a9d643b500017 not found: ID does not exist" containerID="6a1ccdbc7d54d9b9cca3cfdce973da7d14d6e0d995d015cec19a9d643b500017" Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.339701 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1ccdbc7d54d9b9cca3cfdce973da7d14d6e0d995d015cec19a9d643b500017"} err="failed to get container status \"6a1ccdbc7d54d9b9cca3cfdce973da7d14d6e0d995d015cec19a9d643b500017\": rpc error: code = NotFound desc = could not find container 
\"6a1ccdbc7d54d9b9cca3cfdce973da7d14d6e0d995d015cec19a9d643b500017\": container with ID starting with 6a1ccdbc7d54d9b9cca3cfdce973da7d14d6e0d995d015cec19a9d643b500017 not found: ID does not exist" Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.339732 4807 scope.go:117] "RemoveContainer" containerID="333f9f658c041ed40ffe8b76e20a04088423ac592b963e22c88dfc19571672de" Nov 27 11:46:40 crc kubenswrapper[4807]: E1127 11:46:40.340300 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"333f9f658c041ed40ffe8b76e20a04088423ac592b963e22c88dfc19571672de\": container with ID starting with 333f9f658c041ed40ffe8b76e20a04088423ac592b963e22c88dfc19571672de not found: ID does not exist" containerID="333f9f658c041ed40ffe8b76e20a04088423ac592b963e22c88dfc19571672de" Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.340353 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333f9f658c041ed40ffe8b76e20a04088423ac592b963e22c88dfc19571672de"} err="failed to get container status \"333f9f658c041ed40ffe8b76e20a04088423ac592b963e22c88dfc19571672de\": rpc error: code = NotFound desc = could not find container \"333f9f658c041ed40ffe8b76e20a04088423ac592b963e22c88dfc19571672de\": container with ID starting with 333f9f658c041ed40ffe8b76e20a04088423ac592b963e22c88dfc19571672de not found: ID does not exist" Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.340375 4807 scope.go:117] "RemoveContainer" containerID="b9dfd9e6c3fb3de0a01248574440b063483750a894e6fd2d084bb8c769e1835b" Nov 27 11:46:40 crc kubenswrapper[4807]: E1127 11:46:40.340637 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9dfd9e6c3fb3de0a01248574440b063483750a894e6fd2d084bb8c769e1835b\": container with ID starting with b9dfd9e6c3fb3de0a01248574440b063483750a894e6fd2d084bb8c769e1835b not found: ID does not exist" 
containerID="b9dfd9e6c3fb3de0a01248574440b063483750a894e6fd2d084bb8c769e1835b" Nov 27 11:46:40 crc kubenswrapper[4807]: I1127 11:46:40.340688 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9dfd9e6c3fb3de0a01248574440b063483750a894e6fd2d084bb8c769e1835b"} err="failed to get container status \"b9dfd9e6c3fb3de0a01248574440b063483750a894e6fd2d084bb8c769e1835b\": rpc error: code = NotFound desc = could not find container \"b9dfd9e6c3fb3de0a01248574440b063483750a894e6fd2d084bb8c769e1835b\": container with ID starting with b9dfd9e6c3fb3de0a01248574440b063483750a894e6fd2d084bb8c769e1835b not found: ID does not exist" Nov 27 11:46:41 crc kubenswrapper[4807]: I1127 11:46:41.545775 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" path="/var/lib/kubelet/pods/9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c/volumes" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.211833 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-khs6m"] Nov 27 11:47:06 crc kubenswrapper[4807]: E1127 11:47:06.212805 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" containerName="extract-utilities" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.212819 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" containerName="extract-utilities" Nov 27 11:47:06 crc kubenswrapper[4807]: E1127 11:47:06.212841 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" containerName="extract-content" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.212849 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" containerName="extract-content" Nov 27 11:47:06 crc kubenswrapper[4807]: E1127 11:47:06.212856 4807 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" containerName="registry-server" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.212865 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" containerName="registry-server" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.213124 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a3bcc78-0e45-4a57-9a8a-0d2f3e6e715c" containerName="registry-server" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.214714 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.243836 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-khs6m"] Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.261658 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4kp4\" (UniqueName: \"kubernetes.io/projected/6f6e8822-4690-4e8f-89c3-dd9cd216d769-kube-api-access-n4kp4\") pod \"certified-operators-khs6m\" (UID: \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\") " pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.261735 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f6e8822-4690-4e8f-89c3-dd9cd216d769-catalog-content\") pod \"certified-operators-khs6m\" (UID: \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\") " pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.262006 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f6e8822-4690-4e8f-89c3-dd9cd216d769-utilities\") pod 
\"certified-operators-khs6m\" (UID: \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\") " pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.364340 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f6e8822-4690-4e8f-89c3-dd9cd216d769-catalog-content\") pod \"certified-operators-khs6m\" (UID: \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\") " pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.364453 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f6e8822-4690-4e8f-89c3-dd9cd216d769-utilities\") pod \"certified-operators-khs6m\" (UID: \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\") " pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.364547 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4kp4\" (UniqueName: \"kubernetes.io/projected/6f6e8822-4690-4e8f-89c3-dd9cd216d769-kube-api-access-n4kp4\") pod \"certified-operators-khs6m\" (UID: \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\") " pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.365032 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f6e8822-4690-4e8f-89c3-dd9cd216d769-catalog-content\") pod \"certified-operators-khs6m\" (UID: \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\") " pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.365042 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f6e8822-4690-4e8f-89c3-dd9cd216d769-utilities\") pod \"certified-operators-khs6m\" (UID: 
\"6f6e8822-4690-4e8f-89c3-dd9cd216d769\") " pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.385882 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4kp4\" (UniqueName: \"kubernetes.io/projected/6f6e8822-4690-4e8f-89c3-dd9cd216d769-kube-api-access-n4kp4\") pod \"certified-operators-khs6m\" (UID: \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\") " pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:06 crc kubenswrapper[4807]: I1127 11:47:06.544444 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:07 crc kubenswrapper[4807]: I1127 11:47:07.066926 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-khs6m"] Nov 27 11:47:07 crc kubenswrapper[4807]: I1127 11:47:07.467109 4807 generic.go:334] "Generic (PLEG): container finished" podID="6f6e8822-4690-4e8f-89c3-dd9cd216d769" containerID="e55efca18713e5ce4e4a751e5c330206a11bf54961ed533ac05fe1c0bf185617" exitCode=0 Nov 27 11:47:07 crc kubenswrapper[4807]: I1127 11:47:07.467169 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khs6m" event={"ID":"6f6e8822-4690-4e8f-89c3-dd9cd216d769","Type":"ContainerDied","Data":"e55efca18713e5ce4e4a751e5c330206a11bf54961ed533ac05fe1c0bf185617"} Nov 27 11:47:07 crc kubenswrapper[4807]: I1127 11:47:07.467208 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khs6m" event={"ID":"6f6e8822-4690-4e8f-89c3-dd9cd216d769","Type":"ContainerStarted","Data":"8cdfb48ba19e55bbc5257d8d685eb3cbc0179c1137d3b83f51ea7116cd1abe00"} Nov 27 11:47:09 crc kubenswrapper[4807]: I1127 11:47:09.490566 4807 generic.go:334] "Generic (PLEG): container finished" podID="6f6e8822-4690-4e8f-89c3-dd9cd216d769" 
containerID="39ea2abfa742477746e766b653a80505c9da8ea833ca7716cc07beea70d77f1e" exitCode=0 Nov 27 11:47:09 crc kubenswrapper[4807]: I1127 11:47:09.490622 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khs6m" event={"ID":"6f6e8822-4690-4e8f-89c3-dd9cd216d769","Type":"ContainerDied","Data":"39ea2abfa742477746e766b653a80505c9da8ea833ca7716cc07beea70d77f1e"} Nov 27 11:47:10 crc kubenswrapper[4807]: I1127 11:47:10.500594 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khs6m" event={"ID":"6f6e8822-4690-4e8f-89c3-dd9cd216d769","Type":"ContainerStarted","Data":"9791123b3f1e34ae6f57f66ea508d69f5d0101d7cc4506afebb70c8c98617051"} Nov 27 11:47:10 crc kubenswrapper[4807]: I1127 11:47:10.524393 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-khs6m" podStartSLOduration=1.879641033 podStartE2EDuration="4.524373089s" podCreationTimestamp="2025-11-27 11:47:06 +0000 UTC" firstStartedPulling="2025-11-27 11:47:07.470285183 +0000 UTC m=+2268.569783411" lastFinishedPulling="2025-11-27 11:47:10.115017269 +0000 UTC m=+2271.214515467" observedRunningTime="2025-11-27 11:47:10.516435408 +0000 UTC m=+2271.615933616" watchObservedRunningTime="2025-11-27 11:47:10.524373089 +0000 UTC m=+2271.623871287" Nov 27 11:47:16 crc kubenswrapper[4807]: I1127 11:47:16.545504 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:16 crc kubenswrapper[4807]: I1127 11:47:16.547647 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:16 crc kubenswrapper[4807]: I1127 11:47:16.602616 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:17 crc kubenswrapper[4807]: I1127 
11:47:17.618299 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:18 crc kubenswrapper[4807]: I1127 11:47:18.325297 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-khs6m"] Nov 27 11:47:19 crc kubenswrapper[4807]: I1127 11:47:19.583144 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-khs6m" podUID="6f6e8822-4690-4e8f-89c3-dd9cd216d769" containerName="registry-server" containerID="cri-o://9791123b3f1e34ae6f57f66ea508d69f5d0101d7cc4506afebb70c8c98617051" gracePeriod=2 Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.083285 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.169544 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f6e8822-4690-4e8f-89c3-dd9cd216d769-utilities\") pod \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\" (UID: \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\") " Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.169634 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4kp4\" (UniqueName: \"kubernetes.io/projected/6f6e8822-4690-4e8f-89c3-dd9cd216d769-kube-api-access-n4kp4\") pod \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\" (UID: \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\") " Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.169868 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f6e8822-4690-4e8f-89c3-dd9cd216d769-catalog-content\") pod \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\" (UID: \"6f6e8822-4690-4e8f-89c3-dd9cd216d769\") " Nov 27 11:47:20 crc kubenswrapper[4807]: 
I1127 11:47:20.172080 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f6e8822-4690-4e8f-89c3-dd9cd216d769-utilities" (OuterVolumeSpecName: "utilities") pod "6f6e8822-4690-4e8f-89c3-dd9cd216d769" (UID: "6f6e8822-4690-4e8f-89c3-dd9cd216d769"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.175200 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f6e8822-4690-4e8f-89c3-dd9cd216d769-kube-api-access-n4kp4" (OuterVolumeSpecName: "kube-api-access-n4kp4") pod "6f6e8822-4690-4e8f-89c3-dd9cd216d769" (UID: "6f6e8822-4690-4e8f-89c3-dd9cd216d769"). InnerVolumeSpecName "kube-api-access-n4kp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.224870 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f6e8822-4690-4e8f-89c3-dd9cd216d769-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f6e8822-4690-4e8f-89c3-dd9cd216d769" (UID: "6f6e8822-4690-4e8f-89c3-dd9cd216d769"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.272323 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f6e8822-4690-4e8f-89c3-dd9cd216d769-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.272582 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f6e8822-4690-4e8f-89c3-dd9cd216d769-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.272595 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4kp4\" (UniqueName: \"kubernetes.io/projected/6f6e8822-4690-4e8f-89c3-dd9cd216d769-kube-api-access-n4kp4\") on node \"crc\" DevicePath \"\"" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.597866 4807 generic.go:334] "Generic (PLEG): container finished" podID="6f6e8822-4690-4e8f-89c3-dd9cd216d769" containerID="9791123b3f1e34ae6f57f66ea508d69f5d0101d7cc4506afebb70c8c98617051" exitCode=0 Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.597905 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khs6m" event={"ID":"6f6e8822-4690-4e8f-89c3-dd9cd216d769","Type":"ContainerDied","Data":"9791123b3f1e34ae6f57f66ea508d69f5d0101d7cc4506afebb70c8c98617051"} Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.597935 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khs6m" event={"ID":"6f6e8822-4690-4e8f-89c3-dd9cd216d769","Type":"ContainerDied","Data":"8cdfb48ba19e55bbc5257d8d685eb3cbc0179c1137d3b83f51ea7116cd1abe00"} Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.597954 4807 scope.go:117] "RemoveContainer" containerID="9791123b3f1e34ae6f57f66ea508d69f5d0101d7cc4506afebb70c8c98617051" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 
11:47:20.598100 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-khs6m" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.640121 4807 scope.go:117] "RemoveContainer" containerID="39ea2abfa742477746e766b653a80505c9da8ea833ca7716cc07beea70d77f1e" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.650158 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-khs6m"] Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.657353 4807 scope.go:117] "RemoveContainer" containerID="e55efca18713e5ce4e4a751e5c330206a11bf54961ed533ac05fe1c0bf185617" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.659167 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-khs6m"] Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.735683 4807 scope.go:117] "RemoveContainer" containerID="9791123b3f1e34ae6f57f66ea508d69f5d0101d7cc4506afebb70c8c98617051" Nov 27 11:47:20 crc kubenswrapper[4807]: E1127 11:47:20.736093 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9791123b3f1e34ae6f57f66ea508d69f5d0101d7cc4506afebb70c8c98617051\": container with ID starting with 9791123b3f1e34ae6f57f66ea508d69f5d0101d7cc4506afebb70c8c98617051 not found: ID does not exist" containerID="9791123b3f1e34ae6f57f66ea508d69f5d0101d7cc4506afebb70c8c98617051" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.736146 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9791123b3f1e34ae6f57f66ea508d69f5d0101d7cc4506afebb70c8c98617051"} err="failed to get container status \"9791123b3f1e34ae6f57f66ea508d69f5d0101d7cc4506afebb70c8c98617051\": rpc error: code = NotFound desc = could not find container \"9791123b3f1e34ae6f57f66ea508d69f5d0101d7cc4506afebb70c8c98617051\": container with ID starting with 
9791123b3f1e34ae6f57f66ea508d69f5d0101d7cc4506afebb70c8c98617051 not found: ID does not exist" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.736170 4807 scope.go:117] "RemoveContainer" containerID="39ea2abfa742477746e766b653a80505c9da8ea833ca7716cc07beea70d77f1e" Nov 27 11:47:20 crc kubenswrapper[4807]: E1127 11:47:20.736496 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39ea2abfa742477746e766b653a80505c9da8ea833ca7716cc07beea70d77f1e\": container with ID starting with 39ea2abfa742477746e766b653a80505c9da8ea833ca7716cc07beea70d77f1e not found: ID does not exist" containerID="39ea2abfa742477746e766b653a80505c9da8ea833ca7716cc07beea70d77f1e" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.736523 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39ea2abfa742477746e766b653a80505c9da8ea833ca7716cc07beea70d77f1e"} err="failed to get container status \"39ea2abfa742477746e766b653a80505c9da8ea833ca7716cc07beea70d77f1e\": rpc error: code = NotFound desc = could not find container \"39ea2abfa742477746e766b653a80505c9da8ea833ca7716cc07beea70d77f1e\": container with ID starting with 39ea2abfa742477746e766b653a80505c9da8ea833ca7716cc07beea70d77f1e not found: ID does not exist" Nov 27 11:47:20 crc kubenswrapper[4807]: I1127 11:47:20.736540 4807 scope.go:117] "RemoveContainer" containerID="e55efca18713e5ce4e4a751e5c330206a11bf54961ed533ac05fe1c0bf185617" Nov 27 11:47:20 crc kubenswrapper[4807]: E1127 11:47:20.736688 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e55efca18713e5ce4e4a751e5c330206a11bf54961ed533ac05fe1c0bf185617\": container with ID starting with e55efca18713e5ce4e4a751e5c330206a11bf54961ed533ac05fe1c0bf185617 not found: ID does not exist" containerID="e55efca18713e5ce4e4a751e5c330206a11bf54961ed533ac05fe1c0bf185617" Nov 27 11:47:20 crc 
kubenswrapper[4807]: I1127 11:47:20.736730 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e55efca18713e5ce4e4a751e5c330206a11bf54961ed533ac05fe1c0bf185617"} err="failed to get container status \"e55efca18713e5ce4e4a751e5c330206a11bf54961ed533ac05fe1c0bf185617\": rpc error: code = NotFound desc = could not find container \"e55efca18713e5ce4e4a751e5c330206a11bf54961ed533ac05fe1c0bf185617\": container with ID starting with e55efca18713e5ce4e4a751e5c330206a11bf54961ed533ac05fe1c0bf185617 not found: ID does not exist" Nov 27 11:47:21 crc kubenswrapper[4807]: I1127 11:47:21.542214 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f6e8822-4690-4e8f-89c3-dd9cd216d769" path="/var/lib/kubelet/pods/6f6e8822-4690-4e8f-89c3-dd9cd216d769/volumes" Nov 27 11:48:17 crc kubenswrapper[4807]: I1127 11:48:17.098497 4807 generic.go:334] "Generic (PLEG): container finished" podID="08c2cd76-cfdb-4de6-ac04-8925b75415fa" containerID="e992476a320c33b35c44c05376dfc3949a07906a7074ab32f84c91c4060238e6" exitCode=0 Nov 27 11:48:17 crc kubenswrapper[4807]: I1127 11:48:17.098581 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" event={"ID":"08c2cd76-cfdb-4de6-ac04-8925b75415fa","Type":"ContainerDied","Data":"e992476a320c33b35c44c05376dfc3949a07906a7074ab32f84c91c4060238e6"} Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.490968 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.665275 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-ssh-key\") pod \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.665363 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62kl5\" (UniqueName: \"kubernetes.io/projected/08c2cd76-cfdb-4de6-ac04-8925b75415fa-kube-api-access-62kl5\") pod \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.665436 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-migration-ssh-key-1\") pod \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.665496 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-inventory\") pod \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.665598 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-cell1-compute-config-1\") pod \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.665670 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-extra-config-0\") pod \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.665713 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-combined-ca-bundle\") pod \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.665757 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-cell1-compute-config-0\") pod \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.665787 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-migration-ssh-key-0\") pod \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\" (UID: \"08c2cd76-cfdb-4de6-ac04-8925b75415fa\") " Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.671411 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c2cd76-cfdb-4de6-ac04-8925b75415fa-kube-api-access-62kl5" (OuterVolumeSpecName: "kube-api-access-62kl5") pod "08c2cd76-cfdb-4de6-ac04-8925b75415fa" (UID: "08c2cd76-cfdb-4de6-ac04-8925b75415fa"). InnerVolumeSpecName "kube-api-access-62kl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.671937 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "08c2cd76-cfdb-4de6-ac04-8925b75415fa" (UID: "08c2cd76-cfdb-4de6-ac04-8925b75415fa"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.692923 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "08c2cd76-cfdb-4de6-ac04-8925b75415fa" (UID: "08c2cd76-cfdb-4de6-ac04-8925b75415fa"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.693907 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "08c2cd76-cfdb-4de6-ac04-8925b75415fa" (UID: "08c2cd76-cfdb-4de6-ac04-8925b75415fa"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.696105 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "08c2cd76-cfdb-4de6-ac04-8925b75415fa" (UID: "08c2cd76-cfdb-4de6-ac04-8925b75415fa"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.696424 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-inventory" (OuterVolumeSpecName: "inventory") pod "08c2cd76-cfdb-4de6-ac04-8925b75415fa" (UID: "08c2cd76-cfdb-4de6-ac04-8925b75415fa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.699452 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "08c2cd76-cfdb-4de6-ac04-8925b75415fa" (UID: "08c2cd76-cfdb-4de6-ac04-8925b75415fa"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.705363 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "08c2cd76-cfdb-4de6-ac04-8925b75415fa" (UID: "08c2cd76-cfdb-4de6-ac04-8925b75415fa"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.707498 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "08c2cd76-cfdb-4de6-ac04-8925b75415fa" (UID: "08c2cd76-cfdb-4de6-ac04-8925b75415fa"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.768049 4807 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.768092 4807 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.768104 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.768116 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62kl5\" (UniqueName: \"kubernetes.io/projected/08c2cd76-cfdb-4de6-ac04-8925b75415fa-kube-api-access-62kl5\") on node \"crc\" DevicePath \"\"" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.768127 4807 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.768137 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.768150 4807 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 27 
11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.768162 4807 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:48:18 crc kubenswrapper[4807]: I1127 11:48:18.768172 4807 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c2cd76-cfdb-4de6-ac04-8925b75415fa-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.117368 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" event={"ID":"08c2cd76-cfdb-4de6-ac04-8925b75415fa","Type":"ContainerDied","Data":"e31c221a01dcfa34baceb661f098a23099b3673b6abd13a801fd59b355fe10bd"} Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.117409 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e31c221a01dcfa34baceb661f098a23099b3673b6abd13a801fd59b355fe10bd" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.117440 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-xnswz" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.219136 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh"] Nov 27 11:48:19 crc kubenswrapper[4807]: E1127 11:48:19.219512 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6e8822-4690-4e8f-89c3-dd9cd216d769" containerName="extract-content" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.219534 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6e8822-4690-4e8f-89c3-dd9cd216d769" containerName="extract-content" Nov 27 11:48:19 crc kubenswrapper[4807]: E1127 11:48:19.219549 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6e8822-4690-4e8f-89c3-dd9cd216d769" containerName="registry-server" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.219559 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6e8822-4690-4e8f-89c3-dd9cd216d769" containerName="registry-server" Nov 27 11:48:19 crc kubenswrapper[4807]: E1127 11:48:19.219575 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6e8822-4690-4e8f-89c3-dd9cd216d769" containerName="extract-utilities" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.219581 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6e8822-4690-4e8f-89c3-dd9cd216d769" containerName="extract-utilities" Nov 27 11:48:19 crc kubenswrapper[4807]: E1127 11:48:19.219596 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c2cd76-cfdb-4de6-ac04-8925b75415fa" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.219603 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c2cd76-cfdb-4de6-ac04-8925b75415fa" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.219762 4807 
memory_manager.go:354] "RemoveStaleState removing state" podUID="08c2cd76-cfdb-4de6-ac04-8925b75415fa" containerName="nova-edpm-deployment-openstack-edpm-ipam" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.219791 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f6e8822-4690-4e8f-89c3-dd9cd216d769" containerName="registry-server" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.220554 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.223266 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.223999 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.224002 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.224125 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.224441 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.230768 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh"] Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.378056 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96pxs\" (UniqueName: \"kubernetes.io/projected/8a22c7d6-438a-499e-80d0-384ea7d2ec15-kube-api-access-96pxs\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.378111 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.378144 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.378172 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.378268 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.378340 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.378388 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.479320 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.479388 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc 
kubenswrapper[4807]: I1127 11:48:19.479477 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96pxs\" (UniqueName: \"kubernetes.io/projected/8a22c7d6-438a-499e-80d0-384ea7d2ec15-kube-api-access-96pxs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.479509 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.479538 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.479564 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.479633 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.484903 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.485031 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.485029 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.485507 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: 
\"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.485665 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.485758 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.501986 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96pxs\" (UniqueName: \"kubernetes.io/projected/8a22c7d6-438a-499e-80d0-384ea7d2ec15-kube-api-access-96pxs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.538896 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zqmcw" Nov 27 11:48:19 crc kubenswrapper[4807]: I1127 11:48:19.547609 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:48:20 crc kubenswrapper[4807]: I1127 11:48:20.053902 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh"] Nov 27 11:48:20 crc kubenswrapper[4807]: I1127 11:48:20.126507 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" event={"ID":"8a22c7d6-438a-499e-80d0-384ea7d2ec15","Type":"ContainerStarted","Data":"af9f532bee2a474909bd07682028e7174bf472e22035ec530d2d5cd25b5c5067"} Nov 27 11:48:20 crc kubenswrapper[4807]: I1127 11:48:20.921982 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:48:20 crc kubenswrapper[4807]: I1127 11:48:20.922055 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:48:20 crc kubenswrapper[4807]: I1127 11:48:20.973465 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 27 11:48:22 crc kubenswrapper[4807]: I1127 11:48:22.143069 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" event={"ID":"8a22c7d6-438a-499e-80d0-384ea7d2ec15","Type":"ContainerStarted","Data":"2b0201a1520387025623eda936326301ad28f35567b760c920953f68fe1a030f"} Nov 27 11:48:22 crc kubenswrapper[4807]: I1127 11:48:22.167316 4807 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" podStartSLOduration=2.259532294 podStartE2EDuration="3.167229361s" podCreationTimestamp="2025-11-27 11:48:19 +0000 UTC" firstStartedPulling="2025-11-27 11:48:20.063466205 +0000 UTC m=+2341.162964403" lastFinishedPulling="2025-11-27 11:48:20.971163272 +0000 UTC m=+2342.070661470" observedRunningTime="2025-11-27 11:48:22.160944933 +0000 UTC m=+2343.260443131" watchObservedRunningTime="2025-11-27 11:48:22.167229361 +0000 UTC m=+2343.266727559" Nov 27 11:48:50 crc kubenswrapper[4807]: I1127 11:48:50.921485 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:48:50 crc kubenswrapper[4807]: I1127 11:48:50.923331 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:49:20 crc kubenswrapper[4807]: I1127 11:49:20.921619 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:49:20 crc kubenswrapper[4807]: I1127 11:49:20.922151 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Nov 27 11:49:20 crc kubenswrapper[4807]: I1127 11:49:20.922201 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:49:20 crc kubenswrapper[4807]: I1127 11:49:20.922986 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 11:49:20 crc kubenswrapper[4807]: I1127 11:49:20.923050 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" gracePeriod=600 Nov 27 11:49:21 crc kubenswrapper[4807]: E1127 11:49:21.050900 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:49:21 crc kubenswrapper[4807]: I1127 11:49:21.633515 4807 generic.go:334] "Generic (PLEG): container finished" podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" exitCode=0 Nov 27 11:49:21 crc kubenswrapper[4807]: I1127 11:49:21.633609 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" 
event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c"} Nov 27 11:49:21 crc kubenswrapper[4807]: I1127 11:49:21.633845 4807 scope.go:117] "RemoveContainer" containerID="912ea98b94c351525e710a12386f0d7c3210cb64ee176bb082a9b11ed97b0455" Nov 27 11:49:21 crc kubenswrapper[4807]: I1127 11:49:21.634746 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:49:21 crc kubenswrapper[4807]: E1127 11:49:21.635188 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:49:32 crc kubenswrapper[4807]: I1127 11:49:32.533375 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:49:32 crc kubenswrapper[4807]: E1127 11:49:32.534520 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:49:44 crc kubenswrapper[4807]: I1127 11:49:44.532811 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:49:44 crc kubenswrapper[4807]: E1127 11:49:44.533570 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:49:59 crc kubenswrapper[4807]: I1127 11:49:59.540581 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:49:59 crc kubenswrapper[4807]: E1127 11:49:59.541529 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:50:12 crc kubenswrapper[4807]: I1127 11:50:12.533101 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:50:12 crc kubenswrapper[4807]: E1127 11:50:12.533957 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:50:23 crc kubenswrapper[4807]: I1127 11:50:23.533710 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:50:23 crc kubenswrapper[4807]: E1127 11:50:23.534646 4807 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:50:26 crc kubenswrapper[4807]: I1127 11:50:26.179520 4807 generic.go:334] "Generic (PLEG): container finished" podID="8a22c7d6-438a-499e-80d0-384ea7d2ec15" containerID="2b0201a1520387025623eda936326301ad28f35567b760c920953f68fe1a030f" exitCode=0 Nov 27 11:50:26 crc kubenswrapper[4807]: I1127 11:50:26.179596 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" event={"ID":"8a22c7d6-438a-499e-80d0-384ea7d2ec15","Type":"ContainerDied","Data":"2b0201a1520387025623eda936326301ad28f35567b760c920953f68fe1a030f"} Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.567989 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.648707 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-telemetry-combined-ca-bundle\") pod \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.648774 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ssh-key\") pod \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.648812 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96pxs\" (UniqueName: \"kubernetes.io/projected/8a22c7d6-438a-499e-80d0-384ea7d2ec15-kube-api-access-96pxs\") pod \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.648829 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-inventory\") pod \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.648853 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-0\") pod \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.648945 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-2\") pod \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.648970 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-1\") pod \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\" (UID: \"8a22c7d6-438a-499e-80d0-384ea7d2ec15\") " Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.655024 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8a22c7d6-438a-499e-80d0-384ea7d2ec15" (UID: "8a22c7d6-438a-499e-80d0-384ea7d2ec15"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.655613 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a22c7d6-438a-499e-80d0-384ea7d2ec15-kube-api-access-96pxs" (OuterVolumeSpecName: "kube-api-access-96pxs") pod "8a22c7d6-438a-499e-80d0-384ea7d2ec15" (UID: "8a22c7d6-438a-499e-80d0-384ea7d2ec15"). InnerVolumeSpecName "kube-api-access-96pxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.676387 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "8a22c7d6-438a-499e-80d0-384ea7d2ec15" (UID: "8a22c7d6-438a-499e-80d0-384ea7d2ec15"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.677843 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-inventory" (OuterVolumeSpecName: "inventory") pod "8a22c7d6-438a-499e-80d0-384ea7d2ec15" (UID: "8a22c7d6-438a-499e-80d0-384ea7d2ec15"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.682580 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "8a22c7d6-438a-499e-80d0-384ea7d2ec15" (UID: "8a22c7d6-438a-499e-80d0-384ea7d2ec15"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.683883 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8a22c7d6-438a-499e-80d0-384ea7d2ec15" (UID: "8a22c7d6-438a-499e-80d0-384ea7d2ec15"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.684417 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "8a22c7d6-438a-499e-80d0-384ea7d2ec15" (UID: "8a22c7d6-438a-499e-80d0-384ea7d2ec15"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.751259 4807 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.751292 4807 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.751304 4807 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.751313 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.751323 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96pxs\" (UniqueName: \"kubernetes.io/projected/8a22c7d6-438a-499e-80d0-384ea7d2ec15-kube-api-access-96pxs\") on node \"crc\" DevicePath \"\"" Nov 27 11:50:27 crc 
kubenswrapper[4807]: I1127 11:50:27.751334 4807 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-inventory\") on node \"crc\" DevicePath \"\"" Nov 27 11:50:27 crc kubenswrapper[4807]: I1127 11:50:27.751343 4807 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8a22c7d6-438a-499e-80d0-384ea7d2ec15-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 27 11:50:28 crc kubenswrapper[4807]: I1127 11:50:28.196048 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" event={"ID":"8a22c7d6-438a-499e-80d0-384ea7d2ec15","Type":"ContainerDied","Data":"af9f532bee2a474909bd07682028e7174bf472e22035ec530d2d5cd25b5c5067"} Nov 27 11:50:28 crc kubenswrapper[4807]: I1127 11:50:28.196093 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af9f532bee2a474909bd07682028e7174bf472e22035ec530d2d5cd25b5c5067" Nov 27 11:50:28 crc kubenswrapper[4807]: I1127 11:50:28.196115 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh" Nov 27 11:50:36 crc kubenswrapper[4807]: I1127 11:50:36.532778 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:50:36 crc kubenswrapper[4807]: E1127 11:50:36.533939 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:50:48 crc kubenswrapper[4807]: I1127 11:50:48.532629 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:50:48 crc kubenswrapper[4807]: E1127 11:50:48.533379 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:51:01 crc kubenswrapper[4807]: I1127 11:51:01.532998 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:51:01 crc kubenswrapper[4807]: E1127 11:51:01.533803 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.891314 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Nov 27 11:51:08 crc kubenswrapper[4807]: E1127 11:51:08.892361 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a22c7d6-438a-499e-80d0-384ea7d2ec15" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.892383 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a22c7d6-438a-499e-80d0-384ea7d2ec15" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.892601 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a22c7d6-438a-499e-80d0-384ea7d2ec15" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.893287 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.895051 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.896162 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-44p59" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.896221 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.896298 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.906045 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.956975 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/203f3a06-5cde-4778-837a-90fbfde39772-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.957027 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.957064 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-ca-certs\") pod \"tempest-tests-tempest\" 
(UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.957088 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trp2n\" (UniqueName: \"kubernetes.io/projected/203f3a06-5cde-4778-837a-90fbfde39772-kube-api-access-trp2n\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.957156 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/203f3a06-5cde-4778-837a-90fbfde39772-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.957213 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/203f3a06-5cde-4778-837a-90fbfde39772-config-data\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.957275 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.957444 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/203f3a06-5cde-4778-837a-90fbfde39772-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:08 crc kubenswrapper[4807]: I1127 11:51:08.957552 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.059437 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/203f3a06-5cde-4778-837a-90fbfde39772-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.059473 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.059496 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.059511 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trp2n\" (UniqueName: \"kubernetes.io/projected/203f3a06-5cde-4778-837a-90fbfde39772-kube-api-access-trp2n\") pod \"tempest-tests-tempest\" (UID: 
\"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.059560 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/203f3a06-5cde-4778-837a-90fbfde39772-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.059628 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/203f3a06-5cde-4778-837a-90fbfde39772-config-data\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.059645 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.059692 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/203f3a06-5cde-4778-837a-90fbfde39772-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.059731 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " 
pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.060546 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/203f3a06-5cde-4778-837a-90fbfde39772-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.060582 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.060951 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/203f3a06-5cde-4778-837a-90fbfde39772-config-data\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.061550 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/203f3a06-5cde-4778-837a-90fbfde39772-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.061585 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/203f3a06-5cde-4778-837a-90fbfde39772-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc 
kubenswrapper[4807]: I1127 11:51:09.067747 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.075539 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.078092 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.079930 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trp2n\" (UniqueName: \"kubernetes.io/projected/203f3a06-5cde-4778-837a-90fbfde39772-kube-api-access-trp2n\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.096526 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.221780 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 27 11:51:09 crc kubenswrapper[4807]: I1127 11:51:09.643449 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Nov 27 11:51:10 crc kubenswrapper[4807]: I1127 11:51:10.612398 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"203f3a06-5cde-4778-837a-90fbfde39772","Type":"ContainerStarted","Data":"e8cc14f86d11eb7fb50857bafef97e3e2e4ed9560682114efd1257cf09065d17"} Nov 27 11:51:14 crc kubenswrapper[4807]: I1127 11:51:14.532780 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:51:14 crc kubenswrapper[4807]: E1127 11:51:14.533354 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:51:26 crc kubenswrapper[4807]: I1127 11:51:26.532186 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:51:26 crc kubenswrapper[4807]: E1127 11:51:26.532993 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:51:40 crc kubenswrapper[4807]: I1127 11:51:40.532632 4807 scope.go:117] "RemoveContainer" 
containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:51:40 crc kubenswrapper[4807]: E1127 11:51:40.533569 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:51:42 crc kubenswrapper[4807]: E1127 11:51:42.716167 4807 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Nov 27 11:51:42 crc kubenswrapper[4807]: E1127 11:51:42.716735 4807 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPa
th:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-trp2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
tempest-tests-tempest_openstack(203f3a06-5cde-4778-837a-90fbfde39772): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 27 11:51:42 crc kubenswrapper[4807]: E1127 11:51:42.718034 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="203f3a06-5cde-4778-837a-90fbfde39772" Nov 27 11:51:42 crc kubenswrapper[4807]: E1127 11:51:42.916895 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="203f3a06-5cde-4778-837a-90fbfde39772" Nov 27 11:51:54 crc kubenswrapper[4807]: I1127 11:51:54.532368 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:51:54 crc kubenswrapper[4807]: E1127 11:51:54.533003 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:51:55 crc kubenswrapper[4807]: I1127 11:51:55.534934 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 11:51:56 crc kubenswrapper[4807]: I1127 11:51:56.343624 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Nov 27 11:51:58 crc 
kubenswrapper[4807]: I1127 11:51:58.043728 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"203f3a06-5cde-4778-837a-90fbfde39772","Type":"ContainerStarted","Data":"6bbc4d00fd9c3ece20dbdeda954bbdf304df4f640dde36c2bff58a77c17df317"} Nov 27 11:52:05 crc kubenswrapper[4807]: I1127 11:52:05.533384 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:52:05 crc kubenswrapper[4807]: E1127 11:52:05.534260 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:52:18 crc kubenswrapper[4807]: I1127 11:52:18.531917 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:52:18 crc kubenswrapper[4807]: E1127 11:52:18.532696 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:52:31 crc kubenswrapper[4807]: I1127 11:52:31.544083 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:52:31 crc kubenswrapper[4807]: E1127 11:52:31.544735 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:52:45 crc kubenswrapper[4807]: I1127 11:52:45.532978 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:52:45 crc kubenswrapper[4807]: E1127 11:52:45.534645 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:52:58 crc kubenswrapper[4807]: I1127 11:52:58.533123 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:52:58 crc kubenswrapper[4807]: E1127 11:52:58.533964 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:53:10 crc kubenswrapper[4807]: I1127 11:53:10.532675 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:53:10 crc kubenswrapper[4807]: E1127 11:53:10.533377 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:53:25 crc kubenswrapper[4807]: I1127 11:53:25.532787 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:53:25 crc kubenswrapper[4807]: E1127 11:53:25.534301 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:53:40 crc kubenswrapper[4807]: I1127 11:53:40.532177 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:53:40 crc kubenswrapper[4807]: E1127 11:53:40.532933 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:53:53 crc kubenswrapper[4807]: I1127 11:53:53.532384 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:53:53 crc kubenswrapper[4807]: E1127 11:53:53.534585 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:54:04 crc kubenswrapper[4807]: I1127 11:54:04.533032 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:54:04 crc kubenswrapper[4807]: E1127 11:54:04.533750 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:54:19 crc kubenswrapper[4807]: I1127 11:54:19.542381 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:54:19 crc kubenswrapper[4807]: E1127 11:54:19.542956 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 11:54:30 crc kubenswrapper[4807]: I1127 11:54:30.532668 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:54:31 crc kubenswrapper[4807]: I1127 11:54:31.397611 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"e1776cb085c8567a86319844c5502cf85aafc513996684e3735a9a2fa46d2137"} Nov 27 11:54:31 crc kubenswrapper[4807]: I1127 11:54:31.414614 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=157.735329815 podStartE2EDuration="3m24.41459323s" podCreationTimestamp="2025-11-27 11:51:07 +0000 UTC" firstStartedPulling="2025-11-27 11:51:09.662125094 +0000 UTC m=+2510.761623282" lastFinishedPulling="2025-11-27 11:51:56.341388499 +0000 UTC m=+2557.440886697" observedRunningTime="2025-11-27 11:51:58.068102631 +0000 UTC m=+2559.167600839" watchObservedRunningTime="2025-11-27 11:54:31.41459323 +0000 UTC m=+2712.514091428" Nov 27 11:55:11 crc kubenswrapper[4807]: I1127 11:55:11.850082 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-86rhg"] Nov 27 11:55:11 crc kubenswrapper[4807]: I1127 11:55:11.852459 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:11 crc kubenswrapper[4807]: I1127 11:55:11.862096 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86rhg"] Nov 27 11:55:11 crc kubenswrapper[4807]: I1127 11:55:11.907054 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6528544b-4835-427d-97cc-60d2788844e0-catalog-content\") pod \"redhat-operators-86rhg\" (UID: \"6528544b-4835-427d-97cc-60d2788844e0\") " pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:11 crc kubenswrapper[4807]: I1127 11:55:11.907168 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6528544b-4835-427d-97cc-60d2788844e0-utilities\") pod \"redhat-operators-86rhg\" (UID: \"6528544b-4835-427d-97cc-60d2788844e0\") " pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:11 crc kubenswrapper[4807]: I1127 11:55:11.907194 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nljvw\" (UniqueName: \"kubernetes.io/projected/6528544b-4835-427d-97cc-60d2788844e0-kube-api-access-nljvw\") pod \"redhat-operators-86rhg\" (UID: \"6528544b-4835-427d-97cc-60d2788844e0\") " pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:12 crc kubenswrapper[4807]: I1127 11:55:12.008697 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6528544b-4835-427d-97cc-60d2788844e0-utilities\") pod \"redhat-operators-86rhg\" (UID: \"6528544b-4835-427d-97cc-60d2788844e0\") " pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:12 crc kubenswrapper[4807]: I1127 11:55:12.008748 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nljvw\" (UniqueName: \"kubernetes.io/projected/6528544b-4835-427d-97cc-60d2788844e0-kube-api-access-nljvw\") pod \"redhat-operators-86rhg\" (UID: \"6528544b-4835-427d-97cc-60d2788844e0\") " pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:12 crc kubenswrapper[4807]: I1127 11:55:12.008844 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6528544b-4835-427d-97cc-60d2788844e0-catalog-content\") pod \"redhat-operators-86rhg\" (UID: \"6528544b-4835-427d-97cc-60d2788844e0\") " pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:12 crc kubenswrapper[4807]: I1127 11:55:12.009233 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6528544b-4835-427d-97cc-60d2788844e0-utilities\") pod \"redhat-operators-86rhg\" (UID: \"6528544b-4835-427d-97cc-60d2788844e0\") " pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:12 crc kubenswrapper[4807]: I1127 11:55:12.009284 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6528544b-4835-427d-97cc-60d2788844e0-catalog-content\") pod \"redhat-operators-86rhg\" (UID: \"6528544b-4835-427d-97cc-60d2788844e0\") " pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:12 crc kubenswrapper[4807]: I1127 11:55:12.030546 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nljvw\" (UniqueName: \"kubernetes.io/projected/6528544b-4835-427d-97cc-60d2788844e0-kube-api-access-nljvw\") pod \"redhat-operators-86rhg\" (UID: \"6528544b-4835-427d-97cc-60d2788844e0\") " pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:12 crc kubenswrapper[4807]: I1127 11:55:12.198443 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:12 crc kubenswrapper[4807]: I1127 11:55:12.635076 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-86rhg"] Nov 27 11:55:12 crc kubenswrapper[4807]: I1127 11:55:12.802521 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86rhg" event={"ID":"6528544b-4835-427d-97cc-60d2788844e0","Type":"ContainerStarted","Data":"69270b66f9fee6710665eb4ff643258ec6a7bef1dba1702cf337fbedc8b39111"} Nov 27 11:55:13 crc kubenswrapper[4807]: I1127 11:55:13.811443 4807 generic.go:334] "Generic (PLEG): container finished" podID="6528544b-4835-427d-97cc-60d2788844e0" containerID="ca4ac6c041813c986513423820d103c99d2ab70bafccc8f57b12ee12676ae350" exitCode=0 Nov 27 11:55:13 crc kubenswrapper[4807]: I1127 11:55:13.811626 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86rhg" event={"ID":"6528544b-4835-427d-97cc-60d2788844e0","Type":"ContainerDied","Data":"ca4ac6c041813c986513423820d103c99d2ab70bafccc8f57b12ee12676ae350"} Nov 27 11:55:15 crc kubenswrapper[4807]: I1127 11:55:15.831287 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86rhg" event={"ID":"6528544b-4835-427d-97cc-60d2788844e0","Type":"ContainerStarted","Data":"fba5d6540786ee88c4a60ac853172ca298da435e572dd9c38925e7f01c93a03d"} Nov 27 11:55:17 crc kubenswrapper[4807]: I1127 11:55:17.847880 4807 generic.go:334] "Generic (PLEG): container finished" podID="6528544b-4835-427d-97cc-60d2788844e0" containerID="fba5d6540786ee88c4a60ac853172ca298da435e572dd9c38925e7f01c93a03d" exitCode=0 Nov 27 11:55:17 crc kubenswrapper[4807]: I1127 11:55:17.847945 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86rhg" 
event={"ID":"6528544b-4835-427d-97cc-60d2788844e0","Type":"ContainerDied","Data":"fba5d6540786ee88c4a60ac853172ca298da435e572dd9c38925e7f01c93a03d"} Nov 27 11:55:19 crc kubenswrapper[4807]: I1127 11:55:19.867753 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86rhg" event={"ID":"6528544b-4835-427d-97cc-60d2788844e0","Type":"ContainerStarted","Data":"0299a09bf9965e87ecf915d6e21132213c372d588d0a731e7376af831222325a"} Nov 27 11:55:19 crc kubenswrapper[4807]: I1127 11:55:19.894717 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-86rhg" podStartSLOduration=4.017057419 podStartE2EDuration="8.894698576s" podCreationTimestamp="2025-11-27 11:55:11 +0000 UTC" firstStartedPulling="2025-11-27 11:55:13.813509129 +0000 UTC m=+2754.913007327" lastFinishedPulling="2025-11-27 11:55:18.691150286 +0000 UTC m=+2759.790648484" observedRunningTime="2025-11-27 11:55:19.892399605 +0000 UTC m=+2760.991897823" watchObservedRunningTime="2025-11-27 11:55:19.894698576 +0000 UTC m=+2760.994196774" Nov 27 11:55:22 crc kubenswrapper[4807]: I1127 11:55:22.198828 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:22 crc kubenswrapper[4807]: I1127 11:55:22.199740 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:23 crc kubenswrapper[4807]: I1127 11:55:23.244526 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-86rhg" podUID="6528544b-4835-427d-97cc-60d2788844e0" containerName="registry-server" probeResult="failure" output=< Nov 27 11:55:23 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Nov 27 11:55:23 crc kubenswrapper[4807]: > Nov 27 11:55:32 crc kubenswrapper[4807]: I1127 11:55:32.276328 4807 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:32 crc kubenswrapper[4807]: I1127 11:55:32.319388 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:32 crc kubenswrapper[4807]: I1127 11:55:32.510496 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86rhg"] Nov 27 11:55:33 crc kubenswrapper[4807]: I1127 11:55:33.986861 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-86rhg" podUID="6528544b-4835-427d-97cc-60d2788844e0" containerName="registry-server" containerID="cri-o://0299a09bf9965e87ecf915d6e21132213c372d588d0a731e7376af831222325a" gracePeriod=2 Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.010524 4807 generic.go:334] "Generic (PLEG): container finished" podID="6528544b-4835-427d-97cc-60d2788844e0" containerID="0299a09bf9965e87ecf915d6e21132213c372d588d0a731e7376af831222325a" exitCode=0 Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.010608 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86rhg" event={"ID":"6528544b-4835-427d-97cc-60d2788844e0","Type":"ContainerDied","Data":"0299a09bf9965e87ecf915d6e21132213c372d588d0a731e7376af831222325a"} Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.010943 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-86rhg" event={"ID":"6528544b-4835-427d-97cc-60d2788844e0","Type":"ContainerDied","Data":"69270b66f9fee6710665eb4ff643258ec6a7bef1dba1702cf337fbedc8b39111"} Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.010959 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69270b66f9fee6710665eb4ff643258ec6a7bef1dba1702cf337fbedc8b39111" Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.044735 4807 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.200328 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6528544b-4835-427d-97cc-60d2788844e0-utilities\") pod \"6528544b-4835-427d-97cc-60d2788844e0\" (UID: \"6528544b-4835-427d-97cc-60d2788844e0\") " Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.200460 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nljvw\" (UniqueName: \"kubernetes.io/projected/6528544b-4835-427d-97cc-60d2788844e0-kube-api-access-nljvw\") pod \"6528544b-4835-427d-97cc-60d2788844e0\" (UID: \"6528544b-4835-427d-97cc-60d2788844e0\") " Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.200664 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6528544b-4835-427d-97cc-60d2788844e0-catalog-content\") pod \"6528544b-4835-427d-97cc-60d2788844e0\" (UID: \"6528544b-4835-427d-97cc-60d2788844e0\") " Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.202949 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6528544b-4835-427d-97cc-60d2788844e0-utilities" (OuterVolumeSpecName: "utilities") pod "6528544b-4835-427d-97cc-60d2788844e0" (UID: "6528544b-4835-427d-97cc-60d2788844e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.208285 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6528544b-4835-427d-97cc-60d2788844e0-kube-api-access-nljvw" (OuterVolumeSpecName: "kube-api-access-nljvw") pod "6528544b-4835-427d-97cc-60d2788844e0" (UID: "6528544b-4835-427d-97cc-60d2788844e0"). 
InnerVolumeSpecName "kube-api-access-nljvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.303178 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6528544b-4835-427d-97cc-60d2788844e0-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.303211 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nljvw\" (UniqueName: \"kubernetes.io/projected/6528544b-4835-427d-97cc-60d2788844e0-kube-api-access-nljvw\") on node \"crc\" DevicePath \"\"" Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.310446 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6528544b-4835-427d-97cc-60d2788844e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6528544b-4835-427d-97cc-60d2788844e0" (UID: "6528544b-4835-427d-97cc-60d2788844e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:55:35 crc kubenswrapper[4807]: I1127 11:55:35.404767 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6528544b-4835-427d-97cc-60d2788844e0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:55:36 crc kubenswrapper[4807]: I1127 11:55:36.020016 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-86rhg" Nov 27 11:55:36 crc kubenswrapper[4807]: I1127 11:55:36.049433 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-86rhg"] Nov 27 11:55:36 crc kubenswrapper[4807]: I1127 11:55:36.062964 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-86rhg"] Nov 27 11:55:37 crc kubenswrapper[4807]: I1127 11:55:37.546427 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6528544b-4835-427d-97cc-60d2788844e0" path="/var/lib/kubelet/pods/6528544b-4835-427d-97cc-60d2788844e0/volumes" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.234355 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xlgxm"] Nov 27 11:56:12 crc kubenswrapper[4807]: E1127 11:56:12.236491 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6528544b-4835-427d-97cc-60d2788844e0" containerName="registry-server" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.236607 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6528544b-4835-427d-97cc-60d2788844e0" containerName="registry-server" Nov 27 11:56:12 crc kubenswrapper[4807]: E1127 11:56:12.236706 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6528544b-4835-427d-97cc-60d2788844e0" containerName="extract-utilities" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.236792 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6528544b-4835-427d-97cc-60d2788844e0" containerName="extract-utilities" Nov 27 11:56:12 crc kubenswrapper[4807]: E1127 11:56:12.236907 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6528544b-4835-427d-97cc-60d2788844e0" containerName="extract-content" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.236998 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6528544b-4835-427d-97cc-60d2788844e0" 
containerName="extract-content" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.237425 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6528544b-4835-427d-97cc-60d2788844e0" containerName="registry-server" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.239134 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.248037 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xlgxm"] Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.391512 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03500fd1-4bab-451f-8994-0732f2f92bfd-catalog-content\") pod \"community-operators-xlgxm\" (UID: \"03500fd1-4bab-451f-8994-0732f2f92bfd\") " pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.391595 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqs5g\" (UniqueName: \"kubernetes.io/projected/03500fd1-4bab-451f-8994-0732f2f92bfd-kube-api-access-fqs5g\") pod \"community-operators-xlgxm\" (UID: \"03500fd1-4bab-451f-8994-0732f2f92bfd\") " pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.391712 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03500fd1-4bab-451f-8994-0732f2f92bfd-utilities\") pod \"community-operators-xlgxm\" (UID: \"03500fd1-4bab-451f-8994-0732f2f92bfd\") " pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.493913 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03500fd1-4bab-451f-8994-0732f2f92bfd-utilities\") pod \"community-operators-xlgxm\" (UID: \"03500fd1-4bab-451f-8994-0732f2f92bfd\") " pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.494138 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03500fd1-4bab-451f-8994-0732f2f92bfd-catalog-content\") pod \"community-operators-xlgxm\" (UID: \"03500fd1-4bab-451f-8994-0732f2f92bfd\") " pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.494229 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqs5g\" (UniqueName: \"kubernetes.io/projected/03500fd1-4bab-451f-8994-0732f2f92bfd-kube-api-access-fqs5g\") pod \"community-operators-xlgxm\" (UID: \"03500fd1-4bab-451f-8994-0732f2f92bfd\") " pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.494942 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03500fd1-4bab-451f-8994-0732f2f92bfd-utilities\") pod \"community-operators-xlgxm\" (UID: \"03500fd1-4bab-451f-8994-0732f2f92bfd\") " pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.495027 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03500fd1-4bab-451f-8994-0732f2f92bfd-catalog-content\") pod \"community-operators-xlgxm\" (UID: \"03500fd1-4bab-451f-8994-0732f2f92bfd\") " pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.521825 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqs5g\" (UniqueName: 
\"kubernetes.io/projected/03500fd1-4bab-451f-8994-0732f2f92bfd-kube-api-access-fqs5g\") pod \"community-operators-xlgxm\" (UID: \"03500fd1-4bab-451f-8994-0732f2f92bfd\") " pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:12 crc kubenswrapper[4807]: I1127 11:56:12.589997 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:13 crc kubenswrapper[4807]: I1127 11:56:13.095226 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xlgxm"] Nov 27 11:56:13 crc kubenswrapper[4807]: I1127 11:56:13.355679 4807 generic.go:334] "Generic (PLEG): container finished" podID="03500fd1-4bab-451f-8994-0732f2f92bfd" containerID="96de1b624e678270f8f99cb3e316dc6916f4316436e5c45171507e00679982d7" exitCode=0 Nov 27 11:56:13 crc kubenswrapper[4807]: I1127 11:56:13.355726 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlgxm" event={"ID":"03500fd1-4bab-451f-8994-0732f2f92bfd","Type":"ContainerDied","Data":"96de1b624e678270f8f99cb3e316dc6916f4316436e5c45171507e00679982d7"} Nov 27 11:56:13 crc kubenswrapper[4807]: I1127 11:56:13.355759 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlgxm" event={"ID":"03500fd1-4bab-451f-8994-0732f2f92bfd","Type":"ContainerStarted","Data":"d88424f8173ef507a12a0c99b928e5763d7133d9b8ad9ba4cb2727fa4af8d821"} Nov 27 11:56:14 crc kubenswrapper[4807]: I1127 11:56:14.366421 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlgxm" event={"ID":"03500fd1-4bab-451f-8994-0732f2f92bfd","Type":"ContainerStarted","Data":"ffc7aa22b0bea8c2c02801d75101a638529c1ab3f268049eea9d07a3f1379bb7"} Nov 27 11:56:15 crc kubenswrapper[4807]: I1127 11:56:15.377594 4807 generic.go:334] "Generic (PLEG): container finished" podID="03500fd1-4bab-451f-8994-0732f2f92bfd" 
containerID="ffc7aa22b0bea8c2c02801d75101a638529c1ab3f268049eea9d07a3f1379bb7" exitCode=0 Nov 27 11:56:15 crc kubenswrapper[4807]: I1127 11:56:15.377852 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlgxm" event={"ID":"03500fd1-4bab-451f-8994-0732f2f92bfd","Type":"ContainerDied","Data":"ffc7aa22b0bea8c2c02801d75101a638529c1ab3f268049eea9d07a3f1379bb7"} Nov 27 11:56:16 crc kubenswrapper[4807]: I1127 11:56:16.389423 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlgxm" event={"ID":"03500fd1-4bab-451f-8994-0732f2f92bfd","Type":"ContainerStarted","Data":"9b379ef9675b36cc3b402e52794c40a2e1fde88fb494485c21d8fd11b0fe3dea"} Nov 27 11:56:16 crc kubenswrapper[4807]: I1127 11:56:16.414124 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xlgxm" podStartSLOduration=1.9419216929999998 podStartE2EDuration="4.414106234s" podCreationTimestamp="2025-11-27 11:56:12 +0000 UTC" firstStartedPulling="2025-11-27 11:56:13.356990014 +0000 UTC m=+2814.456488212" lastFinishedPulling="2025-11-27 11:56:15.829174555 +0000 UTC m=+2816.928672753" observedRunningTime="2025-11-27 11:56:16.404990411 +0000 UTC m=+2817.504488609" watchObservedRunningTime="2025-11-27 11:56:16.414106234 +0000 UTC m=+2817.513604432" Nov 27 11:56:22 crc kubenswrapper[4807]: I1127 11:56:22.590877 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:22 crc kubenswrapper[4807]: I1127 11:56:22.591527 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:22 crc kubenswrapper[4807]: I1127 11:56:22.638632 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:23 crc kubenswrapper[4807]: I1127 
11:56:23.489303 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:23 crc kubenswrapper[4807]: I1127 11:56:23.543806 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xlgxm"] Nov 27 11:56:25 crc kubenswrapper[4807]: I1127 11:56:25.507346 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xlgxm" podUID="03500fd1-4bab-451f-8994-0732f2f92bfd" containerName="registry-server" containerID="cri-o://9b379ef9675b36cc3b402e52794c40a2e1fde88fb494485c21d8fd11b0fe3dea" gracePeriod=2 Nov 27 11:56:26 crc kubenswrapper[4807]: I1127 11:56:26.517109 4807 generic.go:334] "Generic (PLEG): container finished" podID="03500fd1-4bab-451f-8994-0732f2f92bfd" containerID="9b379ef9675b36cc3b402e52794c40a2e1fde88fb494485c21d8fd11b0fe3dea" exitCode=0 Nov 27 11:56:26 crc kubenswrapper[4807]: I1127 11:56:26.517191 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlgxm" event={"ID":"03500fd1-4bab-451f-8994-0732f2f92bfd","Type":"ContainerDied","Data":"9b379ef9675b36cc3b402e52794c40a2e1fde88fb494485c21d8fd11b0fe3dea"} Nov 27 11:56:28 crc kubenswrapper[4807]: I1127 11:56:28.289179 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:28 crc kubenswrapper[4807]: I1127 11:56:28.457441 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03500fd1-4bab-451f-8994-0732f2f92bfd-catalog-content\") pod \"03500fd1-4bab-451f-8994-0732f2f92bfd\" (UID: \"03500fd1-4bab-451f-8994-0732f2f92bfd\") " Nov 27 11:56:28 crc kubenswrapper[4807]: I1127 11:56:28.457920 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03500fd1-4bab-451f-8994-0732f2f92bfd-utilities\") pod \"03500fd1-4bab-451f-8994-0732f2f92bfd\" (UID: \"03500fd1-4bab-451f-8994-0732f2f92bfd\") " Nov 27 11:56:28 crc kubenswrapper[4807]: I1127 11:56:28.457971 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqs5g\" (UniqueName: \"kubernetes.io/projected/03500fd1-4bab-451f-8994-0732f2f92bfd-kube-api-access-fqs5g\") pod \"03500fd1-4bab-451f-8994-0732f2f92bfd\" (UID: \"03500fd1-4bab-451f-8994-0732f2f92bfd\") " Nov 27 11:56:28 crc kubenswrapper[4807]: I1127 11:56:28.459112 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03500fd1-4bab-451f-8994-0732f2f92bfd-utilities" (OuterVolumeSpecName: "utilities") pod "03500fd1-4bab-451f-8994-0732f2f92bfd" (UID: "03500fd1-4bab-451f-8994-0732f2f92bfd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:56:28 crc kubenswrapper[4807]: I1127 11:56:28.463977 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03500fd1-4bab-451f-8994-0732f2f92bfd-kube-api-access-fqs5g" (OuterVolumeSpecName: "kube-api-access-fqs5g") pod "03500fd1-4bab-451f-8994-0732f2f92bfd" (UID: "03500fd1-4bab-451f-8994-0732f2f92bfd"). InnerVolumeSpecName "kube-api-access-fqs5g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:56:28 crc kubenswrapper[4807]: I1127 11:56:28.508714 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03500fd1-4bab-451f-8994-0732f2f92bfd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03500fd1-4bab-451f-8994-0732f2f92bfd" (UID: "03500fd1-4bab-451f-8994-0732f2f92bfd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:56:28 crc kubenswrapper[4807]: I1127 11:56:28.561316 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03500fd1-4bab-451f-8994-0732f2f92bfd-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:56:28 crc kubenswrapper[4807]: I1127 11:56:28.561370 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqs5g\" (UniqueName: \"kubernetes.io/projected/03500fd1-4bab-451f-8994-0732f2f92bfd-kube-api-access-fqs5g\") on node \"crc\" DevicePath \"\"" Nov 27 11:56:28 crc kubenswrapper[4807]: I1127 11:56:28.561388 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03500fd1-4bab-451f-8994-0732f2f92bfd-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:56:28 crc kubenswrapper[4807]: I1127 11:56:28.944687 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlgxm" event={"ID":"03500fd1-4bab-451f-8994-0732f2f92bfd","Type":"ContainerDied","Data":"d88424f8173ef507a12a0c99b928e5763d7133d9b8ad9ba4cb2727fa4af8d821"} Nov 27 11:56:28 crc kubenswrapper[4807]: I1127 11:56:28.944747 4807 scope.go:117] "RemoveContainer" containerID="9b379ef9675b36cc3b402e52794c40a2e1fde88fb494485c21d8fd11b0fe3dea" Nov 27 11:56:28 crc kubenswrapper[4807]: I1127 11:56:28.944900 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xlgxm" Nov 27 11:56:29 crc kubenswrapper[4807]: I1127 11:56:29.000795 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xlgxm"] Nov 27 11:56:29 crc kubenswrapper[4807]: I1127 11:56:29.002446 4807 scope.go:117] "RemoveContainer" containerID="ffc7aa22b0bea8c2c02801d75101a638529c1ab3f268049eea9d07a3f1379bb7" Nov 27 11:56:29 crc kubenswrapper[4807]: I1127 11:56:29.015904 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xlgxm"] Nov 27 11:56:29 crc kubenswrapper[4807]: I1127 11:56:29.024525 4807 scope.go:117] "RemoveContainer" containerID="96de1b624e678270f8f99cb3e316dc6916f4316436e5c45171507e00679982d7" Nov 27 11:56:29 crc kubenswrapper[4807]: I1127 11:56:29.542666 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03500fd1-4bab-451f-8994-0732f2f92bfd" path="/var/lib/kubelet/pods/03500fd1-4bab-451f-8994-0732f2f92bfd/volumes" Nov 27 11:56:50 crc kubenswrapper[4807]: I1127 11:56:50.922053 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:56:50 crc kubenswrapper[4807]: I1127 11:56:50.922604 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:57:20 crc kubenswrapper[4807]: I1127 11:57:20.922132 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:57:20 crc kubenswrapper[4807]: I1127 11:57:20.922754 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.185219 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jxshm"] Nov 27 11:57:39 crc kubenswrapper[4807]: E1127 11:57:39.186290 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03500fd1-4bab-451f-8994-0732f2f92bfd" containerName="extract-utilities" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.186306 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="03500fd1-4bab-451f-8994-0732f2f92bfd" containerName="extract-utilities" Nov 27 11:57:39 crc kubenswrapper[4807]: E1127 11:57:39.186328 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03500fd1-4bab-451f-8994-0732f2f92bfd" containerName="extract-content" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.186337 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="03500fd1-4bab-451f-8994-0732f2f92bfd" containerName="extract-content" Nov 27 11:57:39 crc kubenswrapper[4807]: E1127 11:57:39.186356 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03500fd1-4bab-451f-8994-0732f2f92bfd" containerName="registry-server" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.186367 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="03500fd1-4bab-451f-8994-0732f2f92bfd" containerName="registry-server" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.186607 4807 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="03500fd1-4bab-451f-8994-0732f2f92bfd" containerName="registry-server" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.188324 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.211354 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxshm"] Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.237757 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820748d5-0663-4e03-8ad1-55857add064a-utilities\") pod \"redhat-marketplace-jxshm\" (UID: \"820748d5-0663-4e03-8ad1-55857add064a\") " pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.237949 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820748d5-0663-4e03-8ad1-55857add064a-catalog-content\") pod \"redhat-marketplace-jxshm\" (UID: \"820748d5-0663-4e03-8ad1-55857add064a\") " pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.238005 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9tdj\" (UniqueName: \"kubernetes.io/projected/820748d5-0663-4e03-8ad1-55857add064a-kube-api-access-m9tdj\") pod \"redhat-marketplace-jxshm\" (UID: \"820748d5-0663-4e03-8ad1-55857add064a\") " pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.339908 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820748d5-0663-4e03-8ad1-55857add064a-catalog-content\") pod 
\"redhat-marketplace-jxshm\" (UID: \"820748d5-0663-4e03-8ad1-55857add064a\") " pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.340047 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9tdj\" (UniqueName: \"kubernetes.io/projected/820748d5-0663-4e03-8ad1-55857add064a-kube-api-access-m9tdj\") pod \"redhat-marketplace-jxshm\" (UID: \"820748d5-0663-4e03-8ad1-55857add064a\") " pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.340131 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820748d5-0663-4e03-8ad1-55857add064a-utilities\") pod \"redhat-marketplace-jxshm\" (UID: \"820748d5-0663-4e03-8ad1-55857add064a\") " pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.340494 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820748d5-0663-4e03-8ad1-55857add064a-catalog-content\") pod \"redhat-marketplace-jxshm\" (UID: \"820748d5-0663-4e03-8ad1-55857add064a\") " pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.340912 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820748d5-0663-4e03-8ad1-55857add064a-utilities\") pod \"redhat-marketplace-jxshm\" (UID: \"820748d5-0663-4e03-8ad1-55857add064a\") " pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.361303 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9tdj\" (UniqueName: \"kubernetes.io/projected/820748d5-0663-4e03-8ad1-55857add064a-kube-api-access-m9tdj\") pod \"redhat-marketplace-jxshm\" (UID: 
\"820748d5-0663-4e03-8ad1-55857add064a\") " pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:39 crc kubenswrapper[4807]: I1127 11:57:39.512349 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:40 crc kubenswrapper[4807]: I1127 11:57:40.067688 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxshm"] Nov 27 11:57:40 crc kubenswrapper[4807]: I1127 11:57:40.618594 4807 generic.go:334] "Generic (PLEG): container finished" podID="820748d5-0663-4e03-8ad1-55857add064a" containerID="ebd33f211f2ad59e4579b54580cad05adf5b1774abc1200f4c062847ea70d555" exitCode=0 Nov 27 11:57:40 crc kubenswrapper[4807]: I1127 11:57:40.618654 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxshm" event={"ID":"820748d5-0663-4e03-8ad1-55857add064a","Type":"ContainerDied","Data":"ebd33f211f2ad59e4579b54580cad05adf5b1774abc1200f4c062847ea70d555"} Nov 27 11:57:40 crc kubenswrapper[4807]: I1127 11:57:40.618894 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxshm" event={"ID":"820748d5-0663-4e03-8ad1-55857add064a","Type":"ContainerStarted","Data":"ca66a43035d82c0cb36daaab164acf2d1059fae6a22576b1bd286c875923a142"} Nov 27 11:57:40 crc kubenswrapper[4807]: I1127 11:57:40.621098 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 11:57:42 crc kubenswrapper[4807]: I1127 11:57:42.639125 4807 generic.go:334] "Generic (PLEG): container finished" podID="820748d5-0663-4e03-8ad1-55857add064a" containerID="2b42567ca8bb088d47751d694c55fed8cecfc7677a36e92001271e0a6e149001" exitCode=0 Nov 27 11:57:42 crc kubenswrapper[4807]: I1127 11:57:42.639186 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxshm" 
event={"ID":"820748d5-0663-4e03-8ad1-55857add064a","Type":"ContainerDied","Data":"2b42567ca8bb088d47751d694c55fed8cecfc7677a36e92001271e0a6e149001"} Nov 27 11:57:43 crc kubenswrapper[4807]: I1127 11:57:43.651867 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxshm" event={"ID":"820748d5-0663-4e03-8ad1-55857add064a","Type":"ContainerStarted","Data":"d03dd6167faf8357f917c72b515c8a7876d7af13088b809f3d7eef948afb98dc"} Nov 27 11:57:43 crc kubenswrapper[4807]: I1127 11:57:43.686120 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jxshm" podStartSLOduration=2.133019962 podStartE2EDuration="4.686095485s" podCreationTimestamp="2025-11-27 11:57:39 +0000 UTC" firstStartedPulling="2025-11-27 11:57:40.62088131 +0000 UTC m=+2901.720379508" lastFinishedPulling="2025-11-27 11:57:43.173956833 +0000 UTC m=+2904.273455031" observedRunningTime="2025-11-27 11:57:43.668974689 +0000 UTC m=+2904.768472897" watchObservedRunningTime="2025-11-27 11:57:43.686095485 +0000 UTC m=+2904.785593703" Nov 27 11:57:49 crc kubenswrapper[4807]: I1127 11:57:49.512796 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:49 crc kubenswrapper[4807]: I1127 11:57:49.513134 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:49 crc kubenswrapper[4807]: I1127 11:57:49.575424 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:49 crc kubenswrapper[4807]: I1127 11:57:49.776864 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:49 crc kubenswrapper[4807]: I1127 11:57:49.865268 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-jxshm"] Nov 27 11:57:50 crc kubenswrapper[4807]: I1127 11:57:50.921845 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 11:57:50 crc kubenswrapper[4807]: I1127 11:57:50.922215 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 11:57:50 crc kubenswrapper[4807]: I1127 11:57:50.922285 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 11:57:50 crc kubenswrapper[4807]: I1127 11:57:50.923021 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e1776cb085c8567a86319844c5502cf85aafc513996684e3735a9a2fa46d2137"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 11:57:50 crc kubenswrapper[4807]: I1127 11:57:50.923077 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://e1776cb085c8567a86319844c5502cf85aafc513996684e3735a9a2fa46d2137" gracePeriod=600 Nov 27 11:57:51 crc kubenswrapper[4807]: I1127 11:57:51.715145 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerID="e1776cb085c8567a86319844c5502cf85aafc513996684e3735a9a2fa46d2137" exitCode=0 Nov 27 11:57:51 crc kubenswrapper[4807]: I1127 11:57:51.715228 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"e1776cb085c8567a86319844c5502cf85aafc513996684e3735a9a2fa46d2137"} Nov 27 11:57:51 crc kubenswrapper[4807]: I1127 11:57:51.715947 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jxshm" podUID="820748d5-0663-4e03-8ad1-55857add064a" containerName="registry-server" containerID="cri-o://d03dd6167faf8357f917c72b515c8a7876d7af13088b809f3d7eef948afb98dc" gracePeriod=2 Nov 27 11:57:51 crc kubenswrapper[4807]: I1127 11:57:51.716049 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6"} Nov 27 11:57:51 crc kubenswrapper[4807]: I1127 11:57:51.716117 4807 scope.go:117] "RemoveContainer" containerID="da97e8b4ad96255bb974e1dd37b834eeb59de949baef17af53a3028909ad2f9c" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.239974 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.386939 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9tdj\" (UniqueName: \"kubernetes.io/projected/820748d5-0663-4e03-8ad1-55857add064a-kube-api-access-m9tdj\") pod \"820748d5-0663-4e03-8ad1-55857add064a\" (UID: \"820748d5-0663-4e03-8ad1-55857add064a\") " Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.387009 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820748d5-0663-4e03-8ad1-55857add064a-utilities\") pod \"820748d5-0663-4e03-8ad1-55857add064a\" (UID: \"820748d5-0663-4e03-8ad1-55857add064a\") " Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.387133 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820748d5-0663-4e03-8ad1-55857add064a-catalog-content\") pod \"820748d5-0663-4e03-8ad1-55857add064a\" (UID: \"820748d5-0663-4e03-8ad1-55857add064a\") " Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.387921 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820748d5-0663-4e03-8ad1-55857add064a-utilities" (OuterVolumeSpecName: "utilities") pod "820748d5-0663-4e03-8ad1-55857add064a" (UID: "820748d5-0663-4e03-8ad1-55857add064a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.391545 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/820748d5-0663-4e03-8ad1-55857add064a-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.392412 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/820748d5-0663-4e03-8ad1-55857add064a-kube-api-access-m9tdj" (OuterVolumeSpecName: "kube-api-access-m9tdj") pod "820748d5-0663-4e03-8ad1-55857add064a" (UID: "820748d5-0663-4e03-8ad1-55857add064a"). InnerVolumeSpecName "kube-api-access-m9tdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.403773 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/820748d5-0663-4e03-8ad1-55857add064a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "820748d5-0663-4e03-8ad1-55857add064a" (UID: "820748d5-0663-4e03-8ad1-55857add064a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.493018 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/820748d5-0663-4e03-8ad1-55857add064a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.493044 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9tdj\" (UniqueName: \"kubernetes.io/projected/820748d5-0663-4e03-8ad1-55857add064a-kube-api-access-m9tdj\") on node \"crc\" DevicePath \"\"" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.731538 4807 generic.go:334] "Generic (PLEG): container finished" podID="820748d5-0663-4e03-8ad1-55857add064a" containerID="d03dd6167faf8357f917c72b515c8a7876d7af13088b809f3d7eef948afb98dc" exitCode=0 Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.731609 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxshm" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.731626 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxshm" event={"ID":"820748d5-0663-4e03-8ad1-55857add064a","Type":"ContainerDied","Data":"d03dd6167faf8357f917c72b515c8a7876d7af13088b809f3d7eef948afb98dc"} Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.731969 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxshm" event={"ID":"820748d5-0663-4e03-8ad1-55857add064a","Type":"ContainerDied","Data":"ca66a43035d82c0cb36daaab164acf2d1059fae6a22576b1bd286c875923a142"} Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.731986 4807 scope.go:117] "RemoveContainer" containerID="d03dd6167faf8357f917c72b515c8a7876d7af13088b809f3d7eef948afb98dc" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.760458 4807 scope.go:117] "RemoveContainer" 
containerID="2b42567ca8bb088d47751d694c55fed8cecfc7677a36e92001271e0a6e149001" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.774769 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxshm"] Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.785657 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxshm"] Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.797399 4807 scope.go:117] "RemoveContainer" containerID="ebd33f211f2ad59e4579b54580cad05adf5b1774abc1200f4c062847ea70d555" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.842451 4807 scope.go:117] "RemoveContainer" containerID="d03dd6167faf8357f917c72b515c8a7876d7af13088b809f3d7eef948afb98dc" Nov 27 11:57:52 crc kubenswrapper[4807]: E1127 11:57:52.843015 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d03dd6167faf8357f917c72b515c8a7876d7af13088b809f3d7eef948afb98dc\": container with ID starting with d03dd6167faf8357f917c72b515c8a7876d7af13088b809f3d7eef948afb98dc not found: ID does not exist" containerID="d03dd6167faf8357f917c72b515c8a7876d7af13088b809f3d7eef948afb98dc" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.843069 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d03dd6167faf8357f917c72b515c8a7876d7af13088b809f3d7eef948afb98dc"} err="failed to get container status \"d03dd6167faf8357f917c72b515c8a7876d7af13088b809f3d7eef948afb98dc\": rpc error: code = NotFound desc = could not find container \"d03dd6167faf8357f917c72b515c8a7876d7af13088b809f3d7eef948afb98dc\": container with ID starting with d03dd6167faf8357f917c72b515c8a7876d7af13088b809f3d7eef948afb98dc not found: ID does not exist" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.843107 4807 scope.go:117] "RemoveContainer" 
containerID="2b42567ca8bb088d47751d694c55fed8cecfc7677a36e92001271e0a6e149001" Nov 27 11:57:52 crc kubenswrapper[4807]: E1127 11:57:52.843675 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b42567ca8bb088d47751d694c55fed8cecfc7677a36e92001271e0a6e149001\": container with ID starting with 2b42567ca8bb088d47751d694c55fed8cecfc7677a36e92001271e0a6e149001 not found: ID does not exist" containerID="2b42567ca8bb088d47751d694c55fed8cecfc7677a36e92001271e0a6e149001" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.843718 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b42567ca8bb088d47751d694c55fed8cecfc7677a36e92001271e0a6e149001"} err="failed to get container status \"2b42567ca8bb088d47751d694c55fed8cecfc7677a36e92001271e0a6e149001\": rpc error: code = NotFound desc = could not find container \"2b42567ca8bb088d47751d694c55fed8cecfc7677a36e92001271e0a6e149001\": container with ID starting with 2b42567ca8bb088d47751d694c55fed8cecfc7677a36e92001271e0a6e149001 not found: ID does not exist" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.843750 4807 scope.go:117] "RemoveContainer" containerID="ebd33f211f2ad59e4579b54580cad05adf5b1774abc1200f4c062847ea70d555" Nov 27 11:57:52 crc kubenswrapper[4807]: E1127 11:57:52.844017 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd33f211f2ad59e4579b54580cad05adf5b1774abc1200f4c062847ea70d555\": container with ID starting with ebd33f211f2ad59e4579b54580cad05adf5b1774abc1200f4c062847ea70d555 not found: ID does not exist" containerID="ebd33f211f2ad59e4579b54580cad05adf5b1774abc1200f4c062847ea70d555" Nov 27 11:57:52 crc kubenswrapper[4807]: I1127 11:57:52.844066 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ebd33f211f2ad59e4579b54580cad05adf5b1774abc1200f4c062847ea70d555"} err="failed to get container status \"ebd33f211f2ad59e4579b54580cad05adf5b1774abc1200f4c062847ea70d555\": rpc error: code = NotFound desc = could not find container \"ebd33f211f2ad59e4579b54580cad05adf5b1774abc1200f4c062847ea70d555\": container with ID starting with ebd33f211f2ad59e4579b54580cad05adf5b1774abc1200f4c062847ea70d555 not found: ID does not exist" Nov 27 11:57:53 crc kubenswrapper[4807]: I1127 11:57:53.541324 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="820748d5-0663-4e03-8ad1-55857add064a" path="/var/lib/kubelet/pods/820748d5-0663-4e03-8ad1-55857add064a/volumes" Nov 27 11:58:31 crc kubenswrapper[4807]: I1127 11:58:31.788896 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dcp88"] Nov 27 11:58:31 crc kubenswrapper[4807]: E1127 11:58:31.789895 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820748d5-0663-4e03-8ad1-55857add064a" containerName="registry-server" Nov 27 11:58:31 crc kubenswrapper[4807]: I1127 11:58:31.789912 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="820748d5-0663-4e03-8ad1-55857add064a" containerName="registry-server" Nov 27 11:58:31 crc kubenswrapper[4807]: E1127 11:58:31.789922 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820748d5-0663-4e03-8ad1-55857add064a" containerName="extract-content" Nov 27 11:58:31 crc kubenswrapper[4807]: I1127 11:58:31.789929 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="820748d5-0663-4e03-8ad1-55857add064a" containerName="extract-content" Nov 27 11:58:31 crc kubenswrapper[4807]: E1127 11:58:31.789966 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="820748d5-0663-4e03-8ad1-55857add064a" containerName="extract-utilities" Nov 27 11:58:31 crc kubenswrapper[4807]: I1127 11:58:31.789973 4807 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="820748d5-0663-4e03-8ad1-55857add064a" containerName="extract-utilities" Nov 27 11:58:31 crc kubenswrapper[4807]: I1127 11:58:31.790165 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="820748d5-0663-4e03-8ad1-55857add064a" containerName="registry-server" Nov 27 11:58:31 crc kubenswrapper[4807]: I1127 11:58:31.791629 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:31 crc kubenswrapper[4807]: I1127 11:58:31.804728 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dcp88"] Nov 27 11:58:31 crc kubenswrapper[4807]: I1127 11:58:31.922092 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7n5z\" (UniqueName: \"kubernetes.io/projected/5684b3fe-b3f3-4381-9c6e-847e327f1497-kube-api-access-j7n5z\") pod \"certified-operators-dcp88\" (UID: \"5684b3fe-b3f3-4381-9c6e-847e327f1497\") " pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:31 crc kubenswrapper[4807]: I1127 11:58:31.922200 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5684b3fe-b3f3-4381-9c6e-847e327f1497-utilities\") pod \"certified-operators-dcp88\" (UID: \"5684b3fe-b3f3-4381-9c6e-847e327f1497\") " pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:31 crc kubenswrapper[4807]: I1127 11:58:31.922245 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5684b3fe-b3f3-4381-9c6e-847e327f1497-catalog-content\") pod \"certified-operators-dcp88\" (UID: \"5684b3fe-b3f3-4381-9c6e-847e327f1497\") " pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:32 crc kubenswrapper[4807]: I1127 11:58:32.031975 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5684b3fe-b3f3-4381-9c6e-847e327f1497-catalog-content\") pod \"certified-operators-dcp88\" (UID: \"5684b3fe-b3f3-4381-9c6e-847e327f1497\") " pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:32 crc kubenswrapper[4807]: I1127 11:58:32.032139 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7n5z\" (UniqueName: \"kubernetes.io/projected/5684b3fe-b3f3-4381-9c6e-847e327f1497-kube-api-access-j7n5z\") pod \"certified-operators-dcp88\" (UID: \"5684b3fe-b3f3-4381-9c6e-847e327f1497\") " pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:32 crc kubenswrapper[4807]: I1127 11:58:32.032231 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5684b3fe-b3f3-4381-9c6e-847e327f1497-utilities\") pod \"certified-operators-dcp88\" (UID: \"5684b3fe-b3f3-4381-9c6e-847e327f1497\") " pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:32 crc kubenswrapper[4807]: I1127 11:58:32.032748 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5684b3fe-b3f3-4381-9c6e-847e327f1497-utilities\") pod \"certified-operators-dcp88\" (UID: \"5684b3fe-b3f3-4381-9c6e-847e327f1497\") " pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:32 crc kubenswrapper[4807]: I1127 11:58:32.032994 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5684b3fe-b3f3-4381-9c6e-847e327f1497-catalog-content\") pod \"certified-operators-dcp88\" (UID: \"5684b3fe-b3f3-4381-9c6e-847e327f1497\") " pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:32 crc kubenswrapper[4807]: I1127 11:58:32.058158 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j7n5z\" (UniqueName: \"kubernetes.io/projected/5684b3fe-b3f3-4381-9c6e-847e327f1497-kube-api-access-j7n5z\") pod \"certified-operators-dcp88\" (UID: \"5684b3fe-b3f3-4381-9c6e-847e327f1497\") " pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:32 crc kubenswrapper[4807]: I1127 11:58:32.118044 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:32 crc kubenswrapper[4807]: I1127 11:58:32.640152 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dcp88"] Nov 27 11:58:32 crc kubenswrapper[4807]: W1127 11:58:32.646721 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5684b3fe_b3f3_4381_9c6e_847e327f1497.slice/crio-58a33b20f1e63a09a65c4955bcf97eafd77d698386dd32736baef5666e13b2e0 WatchSource:0}: Error finding container 58a33b20f1e63a09a65c4955bcf97eafd77d698386dd32736baef5666e13b2e0: Status 404 returned error can't find the container with id 58a33b20f1e63a09a65c4955bcf97eafd77d698386dd32736baef5666e13b2e0 Nov 27 11:58:33 crc kubenswrapper[4807]: I1127 11:58:33.098956 4807 generic.go:334] "Generic (PLEG): container finished" podID="5684b3fe-b3f3-4381-9c6e-847e327f1497" containerID="56f48a143b9013f0bc7c15f563e9401bf6498c23f0219b0147eeb5210fa29ecd" exitCode=0 Nov 27 11:58:33 crc kubenswrapper[4807]: I1127 11:58:33.099016 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcp88" event={"ID":"5684b3fe-b3f3-4381-9c6e-847e327f1497","Type":"ContainerDied","Data":"56f48a143b9013f0bc7c15f563e9401bf6498c23f0219b0147eeb5210fa29ecd"} Nov 27 11:58:33 crc kubenswrapper[4807]: I1127 11:58:33.099351 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcp88" 
event={"ID":"5684b3fe-b3f3-4381-9c6e-847e327f1497","Type":"ContainerStarted","Data":"58a33b20f1e63a09a65c4955bcf97eafd77d698386dd32736baef5666e13b2e0"} Nov 27 11:58:36 crc kubenswrapper[4807]: I1127 11:58:36.139605 4807 generic.go:334] "Generic (PLEG): container finished" podID="5684b3fe-b3f3-4381-9c6e-847e327f1497" containerID="cbc00f2030729f447968389c617377814ad635b16a603046975b4ac9547133d8" exitCode=0 Nov 27 11:58:36 crc kubenswrapper[4807]: I1127 11:58:36.139710 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcp88" event={"ID":"5684b3fe-b3f3-4381-9c6e-847e327f1497","Type":"ContainerDied","Data":"cbc00f2030729f447968389c617377814ad635b16a603046975b4ac9547133d8"} Nov 27 11:58:38 crc kubenswrapper[4807]: I1127 11:58:38.160067 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcp88" event={"ID":"5684b3fe-b3f3-4381-9c6e-847e327f1497","Type":"ContainerStarted","Data":"95bee62228f0597f8871e62fda64d6ba96c4a117b720538cc7082d63341e82fa"} Nov 27 11:58:38 crc kubenswrapper[4807]: I1127 11:58:38.181190 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dcp88" podStartSLOduration=3.298923744 podStartE2EDuration="7.181158706s" podCreationTimestamp="2025-11-27 11:58:31 +0000 UTC" firstStartedPulling="2025-11-27 11:58:33.101136884 +0000 UTC m=+2954.200635082" lastFinishedPulling="2025-11-27 11:58:36.983371836 +0000 UTC m=+2958.082870044" observedRunningTime="2025-11-27 11:58:38.177240602 +0000 UTC m=+2959.276738820" watchObservedRunningTime="2025-11-27 11:58:38.181158706 +0000 UTC m=+2959.280656904" Nov 27 11:58:42 crc kubenswrapper[4807]: I1127 11:58:42.119196 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:42 crc kubenswrapper[4807]: I1127 11:58:42.120342 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:42 crc kubenswrapper[4807]: I1127 11:58:42.171578 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:42 crc kubenswrapper[4807]: I1127 11:58:42.241544 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:42 crc kubenswrapper[4807]: I1127 11:58:42.405777 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dcp88"] Nov 27 11:58:44 crc kubenswrapper[4807]: I1127 11:58:44.215361 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dcp88" podUID="5684b3fe-b3f3-4381-9c6e-847e327f1497" containerName="registry-server" containerID="cri-o://95bee62228f0597f8871e62fda64d6ba96c4a117b720538cc7082d63341e82fa" gracePeriod=2 Nov 27 11:58:44 crc kubenswrapper[4807]: I1127 11:58:44.698196 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:44 crc kubenswrapper[4807]: I1127 11:58:44.794974 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5684b3fe-b3f3-4381-9c6e-847e327f1497-utilities\") pod \"5684b3fe-b3f3-4381-9c6e-847e327f1497\" (UID: \"5684b3fe-b3f3-4381-9c6e-847e327f1497\") " Nov 27 11:58:44 crc kubenswrapper[4807]: I1127 11:58:44.795032 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7n5z\" (UniqueName: \"kubernetes.io/projected/5684b3fe-b3f3-4381-9c6e-847e327f1497-kube-api-access-j7n5z\") pod \"5684b3fe-b3f3-4381-9c6e-847e327f1497\" (UID: \"5684b3fe-b3f3-4381-9c6e-847e327f1497\") " Nov 27 11:58:44 crc kubenswrapper[4807]: I1127 11:58:44.795208 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5684b3fe-b3f3-4381-9c6e-847e327f1497-catalog-content\") pod \"5684b3fe-b3f3-4381-9c6e-847e327f1497\" (UID: \"5684b3fe-b3f3-4381-9c6e-847e327f1497\") " Nov 27 11:58:44 crc kubenswrapper[4807]: I1127 11:58:44.797494 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5684b3fe-b3f3-4381-9c6e-847e327f1497-utilities" (OuterVolumeSpecName: "utilities") pod "5684b3fe-b3f3-4381-9c6e-847e327f1497" (UID: "5684b3fe-b3f3-4381-9c6e-847e327f1497"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:58:44 crc kubenswrapper[4807]: I1127 11:58:44.805311 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5684b3fe-b3f3-4381-9c6e-847e327f1497-kube-api-access-j7n5z" (OuterVolumeSpecName: "kube-api-access-j7n5z") pod "5684b3fe-b3f3-4381-9c6e-847e327f1497" (UID: "5684b3fe-b3f3-4381-9c6e-847e327f1497"). InnerVolumeSpecName "kube-api-access-j7n5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 11:58:44 crc kubenswrapper[4807]: I1127 11:58:44.859564 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5684b3fe-b3f3-4381-9c6e-847e327f1497-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5684b3fe-b3f3-4381-9c6e-847e327f1497" (UID: "5684b3fe-b3f3-4381-9c6e-847e327f1497"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 11:58:44 crc kubenswrapper[4807]: I1127 11:58:44.897714 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5684b3fe-b3f3-4381-9c6e-847e327f1497-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 11:58:44 crc kubenswrapper[4807]: I1127 11:58:44.897943 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5684b3fe-b3f3-4381-9c6e-847e327f1497-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 11:58:44 crc kubenswrapper[4807]: I1127 11:58:44.898005 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7n5z\" (UniqueName: \"kubernetes.io/projected/5684b3fe-b3f3-4381-9c6e-847e327f1497-kube-api-access-j7n5z\") on node \"crc\" DevicePath \"\"" Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.228805 4807 generic.go:334] "Generic (PLEG): container finished" podID="5684b3fe-b3f3-4381-9c6e-847e327f1497" containerID="95bee62228f0597f8871e62fda64d6ba96c4a117b720538cc7082d63341e82fa" exitCode=0 Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.228851 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcp88" event={"ID":"5684b3fe-b3f3-4381-9c6e-847e327f1497","Type":"ContainerDied","Data":"95bee62228f0597f8871e62fda64d6ba96c4a117b720538cc7082d63341e82fa"} Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.228856 4807 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dcp88" Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.228879 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dcp88" event={"ID":"5684b3fe-b3f3-4381-9c6e-847e327f1497","Type":"ContainerDied","Data":"58a33b20f1e63a09a65c4955bcf97eafd77d698386dd32736baef5666e13b2e0"} Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.228902 4807 scope.go:117] "RemoveContainer" containerID="95bee62228f0597f8871e62fda64d6ba96c4a117b720538cc7082d63341e82fa" Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.269392 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dcp88"] Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.271071 4807 scope.go:117] "RemoveContainer" containerID="cbc00f2030729f447968389c617377814ad635b16a603046975b4ac9547133d8" Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.278084 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dcp88"] Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.293271 4807 scope.go:117] "RemoveContainer" containerID="56f48a143b9013f0bc7c15f563e9401bf6498c23f0219b0147eeb5210fa29ecd" Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.334946 4807 scope.go:117] "RemoveContainer" containerID="95bee62228f0597f8871e62fda64d6ba96c4a117b720538cc7082d63341e82fa" Nov 27 11:58:45 crc kubenswrapper[4807]: E1127 11:58:45.335547 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95bee62228f0597f8871e62fda64d6ba96c4a117b720538cc7082d63341e82fa\": container with ID starting with 95bee62228f0597f8871e62fda64d6ba96c4a117b720538cc7082d63341e82fa not found: ID does not exist" containerID="95bee62228f0597f8871e62fda64d6ba96c4a117b720538cc7082d63341e82fa" Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.335592 
4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95bee62228f0597f8871e62fda64d6ba96c4a117b720538cc7082d63341e82fa"} err="failed to get container status \"95bee62228f0597f8871e62fda64d6ba96c4a117b720538cc7082d63341e82fa\": rpc error: code = NotFound desc = could not find container \"95bee62228f0597f8871e62fda64d6ba96c4a117b720538cc7082d63341e82fa\": container with ID starting with 95bee62228f0597f8871e62fda64d6ba96c4a117b720538cc7082d63341e82fa not found: ID does not exist" Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.335620 4807 scope.go:117] "RemoveContainer" containerID="cbc00f2030729f447968389c617377814ad635b16a603046975b4ac9547133d8" Nov 27 11:58:45 crc kubenswrapper[4807]: E1127 11:58:45.336121 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbc00f2030729f447968389c617377814ad635b16a603046975b4ac9547133d8\": container with ID starting with cbc00f2030729f447968389c617377814ad635b16a603046975b4ac9547133d8 not found: ID does not exist" containerID="cbc00f2030729f447968389c617377814ad635b16a603046975b4ac9547133d8" Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.336150 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbc00f2030729f447968389c617377814ad635b16a603046975b4ac9547133d8"} err="failed to get container status \"cbc00f2030729f447968389c617377814ad635b16a603046975b4ac9547133d8\": rpc error: code = NotFound desc = could not find container \"cbc00f2030729f447968389c617377814ad635b16a603046975b4ac9547133d8\": container with ID starting with cbc00f2030729f447968389c617377814ad635b16a603046975b4ac9547133d8 not found: ID does not exist" Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.336172 4807 scope.go:117] "RemoveContainer" containerID="56f48a143b9013f0bc7c15f563e9401bf6498c23f0219b0147eeb5210fa29ecd" Nov 27 11:58:45 crc kubenswrapper[4807]: E1127 
11:58:45.336776 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56f48a143b9013f0bc7c15f563e9401bf6498c23f0219b0147eeb5210fa29ecd\": container with ID starting with 56f48a143b9013f0bc7c15f563e9401bf6498c23f0219b0147eeb5210fa29ecd not found: ID does not exist" containerID="56f48a143b9013f0bc7c15f563e9401bf6498c23f0219b0147eeb5210fa29ecd" Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.336819 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56f48a143b9013f0bc7c15f563e9401bf6498c23f0219b0147eeb5210fa29ecd"} err="failed to get container status \"56f48a143b9013f0bc7c15f563e9401bf6498c23f0219b0147eeb5210fa29ecd\": rpc error: code = NotFound desc = could not find container \"56f48a143b9013f0bc7c15f563e9401bf6498c23f0219b0147eeb5210fa29ecd\": container with ID starting with 56f48a143b9013f0bc7c15f563e9401bf6498c23f0219b0147eeb5210fa29ecd not found: ID does not exist" Nov 27 11:58:45 crc kubenswrapper[4807]: I1127 11:58:45.543464 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5684b3fe-b3f3-4381-9c6e-847e327f1497" path="/var/lib/kubelet/pods/5684b3fe-b3f3-4381-9c6e-847e327f1497/volumes" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.148977 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh"] Nov 27 12:00:00 crc kubenswrapper[4807]: E1127 12:00:00.149950 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5684b3fe-b3f3-4381-9c6e-847e327f1497" containerName="extract-content" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.149972 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5684b3fe-b3f3-4381-9c6e-847e327f1497" containerName="extract-content" Nov 27 12:00:00 crc kubenswrapper[4807]: E1127 12:00:00.149996 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5684b3fe-b3f3-4381-9c6e-847e327f1497" containerName="extract-utilities" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.150002 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5684b3fe-b3f3-4381-9c6e-847e327f1497" containerName="extract-utilities" Nov 27 12:00:00 crc kubenswrapper[4807]: E1127 12:00:00.150020 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5684b3fe-b3f3-4381-9c6e-847e327f1497" containerName="registry-server" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.150026 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5684b3fe-b3f3-4381-9c6e-847e327f1497" containerName="registry-server" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.150202 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5684b3fe-b3f3-4381-9c6e-847e327f1497" containerName="registry-server" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.150896 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.152796 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.153451 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.157229 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh"] Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.264148 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn654\" (UniqueName: \"kubernetes.io/projected/b36a9afe-4035-4c32-a8b3-d405da1d779f-kube-api-access-xn654\") pod 
\"collect-profiles-29404080-kktgh\" (UID: \"b36a9afe-4035-4c32-a8b3-d405da1d779f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.264529 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b36a9afe-4035-4c32-a8b3-d405da1d779f-secret-volume\") pod \"collect-profiles-29404080-kktgh\" (UID: \"b36a9afe-4035-4c32-a8b3-d405da1d779f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.264752 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b36a9afe-4035-4c32-a8b3-d405da1d779f-config-volume\") pod \"collect-profiles-29404080-kktgh\" (UID: \"b36a9afe-4035-4c32-a8b3-d405da1d779f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.366205 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn654\" (UniqueName: \"kubernetes.io/projected/b36a9afe-4035-4c32-a8b3-d405da1d779f-kube-api-access-xn654\") pod \"collect-profiles-29404080-kktgh\" (UID: \"b36a9afe-4035-4c32-a8b3-d405da1d779f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.366342 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b36a9afe-4035-4c32-a8b3-d405da1d779f-secret-volume\") pod \"collect-profiles-29404080-kktgh\" (UID: \"b36a9afe-4035-4c32-a8b3-d405da1d779f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.366407 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b36a9afe-4035-4c32-a8b3-d405da1d779f-config-volume\") pod \"collect-profiles-29404080-kktgh\" (UID: \"b36a9afe-4035-4c32-a8b3-d405da1d779f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.367201 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b36a9afe-4035-4c32-a8b3-d405da1d779f-config-volume\") pod \"collect-profiles-29404080-kktgh\" (UID: \"b36a9afe-4035-4c32-a8b3-d405da1d779f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.373630 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b36a9afe-4035-4c32-a8b3-d405da1d779f-secret-volume\") pod \"collect-profiles-29404080-kktgh\" (UID: \"b36a9afe-4035-4c32-a8b3-d405da1d779f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.382050 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn654\" (UniqueName: \"kubernetes.io/projected/b36a9afe-4035-4c32-a8b3-d405da1d779f-kube-api-access-xn654\") pod \"collect-profiles-29404080-kktgh\" (UID: \"b36a9afe-4035-4c32-a8b3-d405da1d779f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.480803 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" Nov 27 12:00:00 crc kubenswrapper[4807]: I1127 12:00:00.920094 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh"] Nov 27 12:00:01 crc kubenswrapper[4807]: I1127 12:00:01.906289 4807 generic.go:334] "Generic (PLEG): container finished" podID="b36a9afe-4035-4c32-a8b3-d405da1d779f" containerID="b0cab061357d5f563041faf9555e9eba74f668c0167815173a17d77909a6f96d" exitCode=0 Nov 27 12:00:01 crc kubenswrapper[4807]: I1127 12:00:01.906354 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" event={"ID":"b36a9afe-4035-4c32-a8b3-d405da1d779f","Type":"ContainerDied","Data":"b0cab061357d5f563041faf9555e9eba74f668c0167815173a17d77909a6f96d"} Nov 27 12:00:01 crc kubenswrapper[4807]: I1127 12:00:01.907293 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" event={"ID":"b36a9afe-4035-4c32-a8b3-d405da1d779f","Type":"ContainerStarted","Data":"9335b5f62d085997b0d8390341b563f7aeeb8b637c6ff791994105c2c9ac5a2b"} Nov 27 12:00:03 crc kubenswrapper[4807]: I1127 12:00:03.259962 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" Nov 27 12:00:03 crc kubenswrapper[4807]: I1127 12:00:03.323856 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn654\" (UniqueName: \"kubernetes.io/projected/b36a9afe-4035-4c32-a8b3-d405da1d779f-kube-api-access-xn654\") pod \"b36a9afe-4035-4c32-a8b3-d405da1d779f\" (UID: \"b36a9afe-4035-4c32-a8b3-d405da1d779f\") " Nov 27 12:00:03 crc kubenswrapper[4807]: I1127 12:00:03.323909 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b36a9afe-4035-4c32-a8b3-d405da1d779f-config-volume\") pod \"b36a9afe-4035-4c32-a8b3-d405da1d779f\" (UID: \"b36a9afe-4035-4c32-a8b3-d405da1d779f\") " Nov 27 12:00:03 crc kubenswrapper[4807]: I1127 12:00:03.323962 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b36a9afe-4035-4c32-a8b3-d405da1d779f-secret-volume\") pod \"b36a9afe-4035-4c32-a8b3-d405da1d779f\" (UID: \"b36a9afe-4035-4c32-a8b3-d405da1d779f\") " Nov 27 12:00:03 crc kubenswrapper[4807]: I1127 12:00:03.324900 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b36a9afe-4035-4c32-a8b3-d405da1d779f-config-volume" (OuterVolumeSpecName: "config-volume") pod "b36a9afe-4035-4c32-a8b3-d405da1d779f" (UID: "b36a9afe-4035-4c32-a8b3-d405da1d779f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 12:00:03 crc kubenswrapper[4807]: I1127 12:00:03.330446 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b36a9afe-4035-4c32-a8b3-d405da1d779f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b36a9afe-4035-4c32-a8b3-d405da1d779f" (UID: "b36a9afe-4035-4c32-a8b3-d405da1d779f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 12:00:03 crc kubenswrapper[4807]: I1127 12:00:03.330676 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b36a9afe-4035-4c32-a8b3-d405da1d779f-kube-api-access-xn654" (OuterVolumeSpecName: "kube-api-access-xn654") pod "b36a9afe-4035-4c32-a8b3-d405da1d779f" (UID: "b36a9afe-4035-4c32-a8b3-d405da1d779f"). InnerVolumeSpecName "kube-api-access-xn654". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:00:03 crc kubenswrapper[4807]: I1127 12:00:03.426527 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn654\" (UniqueName: \"kubernetes.io/projected/b36a9afe-4035-4c32-a8b3-d405da1d779f-kube-api-access-xn654\") on node \"crc\" DevicePath \"\"" Nov 27 12:00:03 crc kubenswrapper[4807]: I1127 12:00:03.426579 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b36a9afe-4035-4c32-a8b3-d405da1d779f-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 12:00:03 crc kubenswrapper[4807]: I1127 12:00:03.426592 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b36a9afe-4035-4c32-a8b3-d405da1d779f-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 12:00:03 crc kubenswrapper[4807]: I1127 12:00:03.934137 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" event={"ID":"b36a9afe-4035-4c32-a8b3-d405da1d779f","Type":"ContainerDied","Data":"9335b5f62d085997b0d8390341b563f7aeeb8b637c6ff791994105c2c9ac5a2b"} Nov 27 12:00:03 crc kubenswrapper[4807]: I1127 12:00:03.934768 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9335b5f62d085997b0d8390341b563f7aeeb8b637c6ff791994105c2c9ac5a2b" Nov 27 12:00:03 crc kubenswrapper[4807]: I1127 12:00:03.934184 4807 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404080-kktgh" Nov 27 12:00:04 crc kubenswrapper[4807]: I1127 12:00:04.331501 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch"] Nov 27 12:00:04 crc kubenswrapper[4807]: I1127 12:00:04.341072 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404035-l9jch"] Nov 27 12:00:05 crc kubenswrapper[4807]: I1127 12:00:05.546340 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81fc3365-a2d9-4adb-a884-97c8c4b6d3f2" path="/var/lib/kubelet/pods/81fc3365-a2d9-4adb-a884-97c8c4b6d3f2/volumes" Nov 27 12:00:20 crc kubenswrapper[4807]: I1127 12:00:20.921868 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 12:00:20 crc kubenswrapper[4807]: I1127 12:00:20.922941 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 12:00:26 crc kubenswrapper[4807]: I1127 12:00:26.181686 4807 scope.go:117] "RemoveContainer" containerID="5f57af6322a1d584b36a0620f6ed4ebfe5b176473ea50ee276e63b051d449ebc" Nov 27 12:00:50 crc kubenswrapper[4807]: I1127 12:00:50.922173 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Nov 27 12:00:50 crc kubenswrapper[4807]: I1127 12:00:50.922809 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.145021 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29404081-8hnvz"] Nov 27 12:01:00 crc kubenswrapper[4807]: E1127 12:01:00.145990 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b36a9afe-4035-4c32-a8b3-d405da1d779f" containerName="collect-profiles" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.146004 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="b36a9afe-4035-4c32-a8b3-d405da1d779f" containerName="collect-profiles" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.146178 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="b36a9afe-4035-4c32-a8b3-d405da1d779f" containerName="collect-profiles" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.147134 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.158671 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29404081-8hnvz"] Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.256576 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-combined-ca-bundle\") pod \"keystone-cron-29404081-8hnvz\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.256653 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-config-data\") pod \"keystone-cron-29404081-8hnvz\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.256681 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-fernet-keys\") pod \"keystone-cron-29404081-8hnvz\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.256802 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf9l8\" (UniqueName: \"kubernetes.io/projected/d0a23453-ffa9-450e-a401-8f3c4a917196-kube-api-access-nf9l8\") pod \"keystone-cron-29404081-8hnvz\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.358591 4807 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-combined-ca-bundle\") pod \"keystone-cron-29404081-8hnvz\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.358664 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-config-data\") pod \"keystone-cron-29404081-8hnvz\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.358685 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-fernet-keys\") pod \"keystone-cron-29404081-8hnvz\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.358731 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf9l8\" (UniqueName: \"kubernetes.io/projected/d0a23453-ffa9-450e-a401-8f3c4a917196-kube-api-access-nf9l8\") pod \"keystone-cron-29404081-8hnvz\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.364791 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-combined-ca-bundle\") pod \"keystone-cron-29404081-8hnvz\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.365843 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-fernet-keys\") pod \"keystone-cron-29404081-8hnvz\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.369494 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-config-data\") pod \"keystone-cron-29404081-8hnvz\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.379410 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf9l8\" (UniqueName: \"kubernetes.io/projected/d0a23453-ffa9-450e-a401-8f3c4a917196-kube-api-access-nf9l8\") pod \"keystone-cron-29404081-8hnvz\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:00 crc kubenswrapper[4807]: I1127 12:01:00.463210 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:01 crc kubenswrapper[4807]: I1127 12:01:01.025784 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29404081-8hnvz"] Nov 27 12:01:01 crc kubenswrapper[4807]: I1127 12:01:01.447082 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29404081-8hnvz" event={"ID":"d0a23453-ffa9-450e-a401-8f3c4a917196","Type":"ContainerStarted","Data":"52bd343facf3a06bd13fe7325edaa7afd4846e78d980a87aa1cf06a0337e4d5b"} Nov 27 12:01:01 crc kubenswrapper[4807]: I1127 12:01:01.447498 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29404081-8hnvz" event={"ID":"d0a23453-ffa9-450e-a401-8f3c4a917196","Type":"ContainerStarted","Data":"f0aea5893a3962d7475ea3e3f494e0781e60699628bb52897f71c3bbc2893287"} Nov 27 12:01:01 crc kubenswrapper[4807]: I1127 12:01:01.464137 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29404081-8hnvz" podStartSLOduration=1.464120729 podStartE2EDuration="1.464120729s" podCreationTimestamp="2025-11-27 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 12:01:01.462713632 +0000 UTC m=+3102.562211820" watchObservedRunningTime="2025-11-27 12:01:01.464120729 +0000 UTC m=+3102.563618927" Nov 27 12:01:03 crc kubenswrapper[4807]: I1127 12:01:03.472522 4807 generic.go:334] "Generic (PLEG): container finished" podID="d0a23453-ffa9-450e-a401-8f3c4a917196" containerID="52bd343facf3a06bd13fe7325edaa7afd4846e78d980a87aa1cf06a0337e4d5b" exitCode=0 Nov 27 12:01:03 crc kubenswrapper[4807]: I1127 12:01:03.472620 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29404081-8hnvz" 
event={"ID":"d0a23453-ffa9-450e-a401-8f3c4a917196","Type":"ContainerDied","Data":"52bd343facf3a06bd13fe7325edaa7afd4846e78d980a87aa1cf06a0337e4d5b"} Nov 27 12:01:04 crc kubenswrapper[4807]: I1127 12:01:04.864069 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:04 crc kubenswrapper[4807]: I1127 12:01:04.942396 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-combined-ca-bundle\") pod \"d0a23453-ffa9-450e-a401-8f3c4a917196\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " Nov 27 12:01:04 crc kubenswrapper[4807]: I1127 12:01:04.942481 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf9l8\" (UniqueName: \"kubernetes.io/projected/d0a23453-ffa9-450e-a401-8f3c4a917196-kube-api-access-nf9l8\") pod \"d0a23453-ffa9-450e-a401-8f3c4a917196\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " Nov 27 12:01:04 crc kubenswrapper[4807]: I1127 12:01:04.942534 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-config-data\") pod \"d0a23453-ffa9-450e-a401-8f3c4a917196\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " Nov 27 12:01:04 crc kubenswrapper[4807]: I1127 12:01:04.942581 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-fernet-keys\") pod \"d0a23453-ffa9-450e-a401-8f3c4a917196\" (UID: \"d0a23453-ffa9-450e-a401-8f3c4a917196\") " Nov 27 12:01:04 crc kubenswrapper[4807]: I1127 12:01:04.947811 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "d0a23453-ffa9-450e-a401-8f3c4a917196" (UID: "d0a23453-ffa9-450e-a401-8f3c4a917196"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 12:01:04 crc kubenswrapper[4807]: I1127 12:01:04.949031 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a23453-ffa9-450e-a401-8f3c4a917196-kube-api-access-nf9l8" (OuterVolumeSpecName: "kube-api-access-nf9l8") pod "d0a23453-ffa9-450e-a401-8f3c4a917196" (UID: "d0a23453-ffa9-450e-a401-8f3c4a917196"). InnerVolumeSpecName "kube-api-access-nf9l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:01:04 crc kubenswrapper[4807]: I1127 12:01:04.974221 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0a23453-ffa9-450e-a401-8f3c4a917196" (UID: "d0a23453-ffa9-450e-a401-8f3c4a917196"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 12:01:04 crc kubenswrapper[4807]: I1127 12:01:04.996624 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-config-data" (OuterVolumeSpecName: "config-data") pod "d0a23453-ffa9-450e-a401-8f3c4a917196" (UID: "d0a23453-ffa9-450e-a401-8f3c4a917196"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 12:01:05 crc kubenswrapper[4807]: I1127 12:01:05.044909 4807 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 27 12:01:05 crc kubenswrapper[4807]: I1127 12:01:05.044940 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf9l8\" (UniqueName: \"kubernetes.io/projected/d0a23453-ffa9-450e-a401-8f3c4a917196-kube-api-access-nf9l8\") on node \"crc\" DevicePath \"\"" Nov 27 12:01:05 crc kubenswrapper[4807]: I1127 12:01:05.045101 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 12:01:05 crc kubenswrapper[4807]: I1127 12:01:05.045112 4807 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d0a23453-ffa9-450e-a401-8f3c4a917196-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 27 12:01:05 crc kubenswrapper[4807]: I1127 12:01:05.493550 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29404081-8hnvz" event={"ID":"d0a23453-ffa9-450e-a401-8f3c4a917196","Type":"ContainerDied","Data":"f0aea5893a3962d7475ea3e3f494e0781e60699628bb52897f71c3bbc2893287"} Nov 27 12:01:05 crc kubenswrapper[4807]: I1127 12:01:05.493595 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0aea5893a3962d7475ea3e3f494e0781e60699628bb52897f71c3bbc2893287" Nov 27 12:01:05 crc kubenswrapper[4807]: I1127 12:01:05.493594 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29404081-8hnvz" Nov 27 12:01:20 crc kubenswrapper[4807]: I1127 12:01:20.921508 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 12:01:20 crc kubenswrapper[4807]: I1127 12:01:20.922092 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 12:01:20 crc kubenswrapper[4807]: I1127 12:01:20.922141 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 12:01:20 crc kubenswrapper[4807]: I1127 12:01:20.922952 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 12:01:20 crc kubenswrapper[4807]: I1127 12:01:20.923016 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" gracePeriod=600 Nov 27 12:01:21 crc kubenswrapper[4807]: E1127 12:01:21.045721 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:01:21 crc kubenswrapper[4807]: I1127 12:01:21.635053 4807 generic.go:334] "Generic (PLEG): container finished" podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" exitCode=0 Nov 27 12:01:21 crc kubenswrapper[4807]: I1127 12:01:21.635145 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6"} Nov 27 12:01:21 crc kubenswrapper[4807]: I1127 12:01:21.635394 4807 scope.go:117] "RemoveContainer" containerID="e1776cb085c8567a86319844c5502cf85aafc513996684e3735a9a2fa46d2137" Nov 27 12:01:21 crc kubenswrapper[4807]: I1127 12:01:21.636120 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:01:21 crc kubenswrapper[4807]: E1127 12:01:21.636451 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:01:26 crc kubenswrapper[4807]: I1127 12:01:26.241351 4807 scope.go:117] "RemoveContainer" containerID="fba5d6540786ee88c4a60ac853172ca298da435e572dd9c38925e7f01c93a03d" Nov 27 12:01:26 crc 
kubenswrapper[4807]: I1127 12:01:26.266971 4807 scope.go:117] "RemoveContainer" containerID="ca4ac6c041813c986513423820d103c99d2ab70bafccc8f57b12ee12676ae350" Nov 27 12:01:26 crc kubenswrapper[4807]: I1127 12:01:26.322493 4807 scope.go:117] "RemoveContainer" containerID="0299a09bf9965e87ecf915d6e21132213c372d588d0a731e7376af831222325a" Nov 27 12:01:32 crc kubenswrapper[4807]: I1127 12:01:32.533085 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:01:32 crc kubenswrapper[4807]: E1127 12:01:32.533845 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:01:43 crc kubenswrapper[4807]: I1127 12:01:43.532273 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:01:43 crc kubenswrapper[4807]: E1127 12:01:43.533057 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:01:58 crc kubenswrapper[4807]: I1127 12:01:58.532039 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:01:58 crc kubenswrapper[4807]: E1127 12:01:58.532787 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:02:13 crc kubenswrapper[4807]: I1127 12:02:13.532855 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:02:13 crc kubenswrapper[4807]: E1127 12:02:13.534772 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:02:28 crc kubenswrapper[4807]: I1127 12:02:28.531990 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:02:28 crc kubenswrapper[4807]: E1127 12:02:28.532834 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:02:40 crc kubenswrapper[4807]: I1127 12:02:40.532473 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:02:40 crc kubenswrapper[4807]: E1127 12:02:40.533338 4807 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:02:52 crc kubenswrapper[4807]: I1127 12:02:52.532674 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:02:52 crc kubenswrapper[4807]: E1127 12:02:52.533615 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:03:07 crc kubenswrapper[4807]: I1127 12:03:07.536312 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:03:07 crc kubenswrapper[4807]: E1127 12:03:07.537156 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:03:18 crc kubenswrapper[4807]: I1127 12:03:18.532841 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:03:18 crc kubenswrapper[4807]: E1127 12:03:18.534682 4807 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:03:31 crc kubenswrapper[4807]: I1127 12:03:31.533726 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:03:31 crc kubenswrapper[4807]: E1127 12:03:31.534419 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:03:31 crc kubenswrapper[4807]: I1127 12:03:31.818339 4807 generic.go:334] "Generic (PLEG): container finished" podID="203f3a06-5cde-4778-837a-90fbfde39772" containerID="6bbc4d00fd9c3ece20dbdeda954bbdf304df4f640dde36c2bff58a77c17df317" exitCode=0 Nov 27 12:03:31 crc kubenswrapper[4807]: I1127 12:03:31.818391 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"203f3a06-5cde-4778-837a-90fbfde39772","Type":"ContainerDied","Data":"6bbc4d00fd9c3ece20dbdeda954bbdf304df4f640dde36c2bff58a77c17df317"} Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.251789 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.330809 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trp2n\" (UniqueName: \"kubernetes.io/projected/203f3a06-5cde-4778-837a-90fbfde39772-kube-api-access-trp2n\") pod \"203f3a06-5cde-4778-837a-90fbfde39772\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.330877 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/203f3a06-5cde-4778-837a-90fbfde39772-test-operator-ephemeral-workdir\") pod \"203f3a06-5cde-4778-837a-90fbfde39772\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.331047 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-ssh-key\") pod \"203f3a06-5cde-4778-837a-90fbfde39772\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.331077 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-openstack-config-secret\") pod \"203f3a06-5cde-4778-837a-90fbfde39772\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.331118 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/203f3a06-5cde-4778-837a-90fbfde39772-openstack-config\") pod \"203f3a06-5cde-4778-837a-90fbfde39772\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.331182 4807 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-ca-certs\") pod \"203f3a06-5cde-4778-837a-90fbfde39772\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.331281 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/203f3a06-5cde-4778-837a-90fbfde39772-config-data\") pod \"203f3a06-5cde-4778-837a-90fbfde39772\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.331324 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/203f3a06-5cde-4778-837a-90fbfde39772-test-operator-ephemeral-temporary\") pod \"203f3a06-5cde-4778-837a-90fbfde39772\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.331360 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"203f3a06-5cde-4778-837a-90fbfde39772\" (UID: \"203f3a06-5cde-4778-837a-90fbfde39772\") " Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.332966 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/203f3a06-5cde-4778-837a-90fbfde39772-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "203f3a06-5cde-4778-837a-90fbfde39772" (UID: "203f3a06-5cde-4778-837a-90fbfde39772"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.333420 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/203f3a06-5cde-4778-837a-90fbfde39772-config-data" (OuterVolumeSpecName: "config-data") pod "203f3a06-5cde-4778-837a-90fbfde39772" (UID: "203f3a06-5cde-4778-837a-90fbfde39772"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.335496 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/203f3a06-5cde-4778-837a-90fbfde39772-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "203f3a06-5cde-4778-837a-90fbfde39772" (UID: "203f3a06-5cde-4778-837a-90fbfde39772"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.348952 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203f3a06-5cde-4778-837a-90fbfde39772-kube-api-access-trp2n" (OuterVolumeSpecName: "kube-api-access-trp2n") pod "203f3a06-5cde-4778-837a-90fbfde39772" (UID: "203f3a06-5cde-4778-837a-90fbfde39772"). InnerVolumeSpecName "kube-api-access-trp2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.349224 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "203f3a06-5cde-4778-837a-90fbfde39772" (UID: "203f3a06-5cde-4778-837a-90fbfde39772"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.359160 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "203f3a06-5cde-4778-837a-90fbfde39772" (UID: "203f3a06-5cde-4778-837a-90fbfde39772"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.364938 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "203f3a06-5cde-4778-837a-90fbfde39772" (UID: "203f3a06-5cde-4778-837a-90fbfde39772"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.378973 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/203f3a06-5cde-4778-837a-90fbfde39772-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "203f3a06-5cde-4778-837a-90fbfde39772" (UID: "203f3a06-5cde-4778-837a-90fbfde39772"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.416477 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "203f3a06-5cde-4778-837a-90fbfde39772" (UID: "203f3a06-5cde-4778-837a-90fbfde39772"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.433778 4807 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/203f3a06-5cde-4778-837a-90fbfde39772-config-data\") on node \"crc\" DevicePath \"\"" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.433812 4807 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/203f3a06-5cde-4778-837a-90fbfde39772-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.433846 4807 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.433856 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trp2n\" (UniqueName: \"kubernetes.io/projected/203f3a06-5cde-4778-837a-90fbfde39772-kube-api-access-trp2n\") on node \"crc\" DevicePath \"\"" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.433867 4807 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/203f3a06-5cde-4778-837a-90fbfde39772-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.433875 4807 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.433885 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 27 
12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.433893 4807 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/203f3a06-5cde-4778-837a-90fbfde39772-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.433904 4807 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/203f3a06-5cde-4778-837a-90fbfde39772-ca-certs\") on node \"crc\" DevicePath \"\"" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.457509 4807 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.539074 4807 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.836484 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"203f3a06-5cde-4778-837a-90fbfde39772","Type":"ContainerDied","Data":"e8cc14f86d11eb7fb50857bafef97e3e2e4ed9560682114efd1257cf09065d17"} Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.836536 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8cc14f86d11eb7fb50857bafef97e3e2e4ed9560682114efd1257cf09065d17" Nov 27 12:03:33 crc kubenswrapper[4807]: I1127 12:03:33.836600 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Nov 27 12:03:42 crc kubenswrapper[4807]: I1127 12:03:42.533101 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:03:42 crc kubenswrapper[4807]: E1127 12:03:42.533927 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.017349 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 27 12:03:45 crc kubenswrapper[4807]: E1127 12:03:45.018396 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203f3a06-5cde-4778-837a-90fbfde39772" containerName="tempest-tests-tempest-tests-runner" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.018424 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="203f3a06-5cde-4778-837a-90fbfde39772" containerName="tempest-tests-tempest-tests-runner" Nov 27 12:03:45 crc kubenswrapper[4807]: E1127 12:03:45.018450 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a23453-ffa9-450e-a401-8f3c4a917196" containerName="keystone-cron" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.018461 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a23453-ffa9-450e-a401-8f3c4a917196" containerName="keystone-cron" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.018815 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a23453-ffa9-450e-a401-8f3c4a917196" containerName="keystone-cron" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 
12:03:45.018852 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="203f3a06-5cde-4778-837a-90fbfde39772" containerName="tempest-tests-tempest-tests-runner" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.020104 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.022499 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-44p59" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.035388 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.162471 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e67a9ba6-daee-4e19-bf83-d51152329c5c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.162651 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-855qv\" (UniqueName: \"kubernetes.io/projected/e67a9ba6-daee-4e19-bf83-d51152329c5c-kube-api-access-855qv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e67a9ba6-daee-4e19-bf83-d51152329c5c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.264289 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e67a9ba6-daee-4e19-bf83-d51152329c5c\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.264436 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-855qv\" (UniqueName: \"kubernetes.io/projected/e67a9ba6-daee-4e19-bf83-d51152329c5c-kube-api-access-855qv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e67a9ba6-daee-4e19-bf83-d51152329c5c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.264709 4807 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e67a9ba6-daee-4e19-bf83-d51152329c5c\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.283561 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-855qv\" (UniqueName: \"kubernetes.io/projected/e67a9ba6-daee-4e19-bf83-d51152329c5c-kube-api-access-855qv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e67a9ba6-daee-4e19-bf83-d51152329c5c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.302510 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e67a9ba6-daee-4e19-bf83-d51152329c5c\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.350907 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.817367 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.819617 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 12:03:45 crc kubenswrapper[4807]: I1127 12:03:45.961583 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e67a9ba6-daee-4e19-bf83-d51152329c5c","Type":"ContainerStarted","Data":"dcdac4ebca567babead2f80e648e70bff932d1321e214b36ad5f523acc3a462b"} Nov 27 12:03:47 crc kubenswrapper[4807]: I1127 12:03:47.979823 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e67a9ba6-daee-4e19-bf83-d51152329c5c","Type":"ContainerStarted","Data":"8346dafd707dbc1c4cba32901dda569a51d8660495bae46e00c90d544c8453aa"} Nov 27 12:03:47 crc kubenswrapper[4807]: I1127 12:03:47.997562 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.938312561 podStartE2EDuration="3.997543414s" podCreationTimestamp="2025-11-27 12:03:44 +0000 UTC" firstStartedPulling="2025-11-27 12:03:45.819387149 +0000 UTC m=+3266.918885347" lastFinishedPulling="2025-11-27 12:03:46.878618002 +0000 UTC m=+3267.978116200" observedRunningTime="2025-11-27 12:03:47.990014833 +0000 UTC m=+3269.089513031" watchObservedRunningTime="2025-11-27 12:03:47.997543414 +0000 UTC m=+3269.097041612" Nov 27 12:03:56 crc kubenswrapper[4807]: I1127 12:03:56.532558 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:03:56 crc kubenswrapper[4807]: E1127 
12:03:56.533224 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:04:09 crc kubenswrapper[4807]: I1127 12:04:09.538725 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:04:09 crc kubenswrapper[4807]: E1127 12:04:09.539572 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.189180 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kg6jg/must-gather-k626h"] Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.190911 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg6jg/must-gather-k626h" Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.195296 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kg6jg"/"openshift-service-ca.crt" Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.195444 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kg6jg"/"kube-root-ca.crt" Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.196407 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kg6jg"/"default-dockercfg-vchfd" Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.243534 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kg6jg/must-gather-k626h"] Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.370716 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxmv6\" (UniqueName: \"kubernetes.io/projected/ca489b67-7355-4cf9-a5f8-8fd359f37d63-kube-api-access-sxmv6\") pod \"must-gather-k626h\" (UID: \"ca489b67-7355-4cf9-a5f8-8fd359f37d63\") " pod="openshift-must-gather-kg6jg/must-gather-k626h" Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.370780 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ca489b67-7355-4cf9-a5f8-8fd359f37d63-must-gather-output\") pod \"must-gather-k626h\" (UID: \"ca489b67-7355-4cf9-a5f8-8fd359f37d63\") " pod="openshift-must-gather-kg6jg/must-gather-k626h" Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.472036 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxmv6\" (UniqueName: \"kubernetes.io/projected/ca489b67-7355-4cf9-a5f8-8fd359f37d63-kube-api-access-sxmv6\") pod \"must-gather-k626h\" (UID: \"ca489b67-7355-4cf9-a5f8-8fd359f37d63\") " 
pod="openshift-must-gather-kg6jg/must-gather-k626h" Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.472082 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ca489b67-7355-4cf9-a5f8-8fd359f37d63-must-gather-output\") pod \"must-gather-k626h\" (UID: \"ca489b67-7355-4cf9-a5f8-8fd359f37d63\") " pod="openshift-must-gather-kg6jg/must-gather-k626h" Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.472634 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ca489b67-7355-4cf9-a5f8-8fd359f37d63-must-gather-output\") pod \"must-gather-k626h\" (UID: \"ca489b67-7355-4cf9-a5f8-8fd359f37d63\") " pod="openshift-must-gather-kg6jg/must-gather-k626h" Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.505735 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxmv6\" (UniqueName: \"kubernetes.io/projected/ca489b67-7355-4cf9-a5f8-8fd359f37d63-kube-api-access-sxmv6\") pod \"must-gather-k626h\" (UID: \"ca489b67-7355-4cf9-a5f8-8fd359f37d63\") " pod="openshift-must-gather-kg6jg/must-gather-k626h" Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.513656 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg6jg/must-gather-k626h" Nov 27 12:04:12 crc kubenswrapper[4807]: I1127 12:04:12.956405 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kg6jg/must-gather-k626h"] Nov 27 12:04:13 crc kubenswrapper[4807]: I1127 12:04:13.221949 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg6jg/must-gather-k626h" event={"ID":"ca489b67-7355-4cf9-a5f8-8fd359f37d63","Type":"ContainerStarted","Data":"f44d034f34ffa15023f045b8c8af079ba49a92f1b9d4eca038e99d0ce1e92c31"} Nov 27 12:04:21 crc kubenswrapper[4807]: I1127 12:04:21.304170 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg6jg/must-gather-k626h" event={"ID":"ca489b67-7355-4cf9-a5f8-8fd359f37d63","Type":"ContainerStarted","Data":"ec6bf21ef39757de17f6205111e1c00b96dfb8ad9889f98794f319f431404c7e"} Nov 27 12:04:21 crc kubenswrapper[4807]: I1127 12:04:21.304769 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg6jg/must-gather-k626h" event={"ID":"ca489b67-7355-4cf9-a5f8-8fd359f37d63","Type":"ContainerStarted","Data":"b21e5d0fdb37f8175263d8b63106791709eaa244aa80616b13353856b2d346b8"} Nov 27 12:04:21 crc kubenswrapper[4807]: I1127 12:04:21.322971 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kg6jg/must-gather-k626h" podStartSLOduration=2.098977808 podStartE2EDuration="9.322956508s" podCreationTimestamp="2025-11-27 12:04:12 +0000 UTC" firstStartedPulling="2025-11-27 12:04:12.970315763 +0000 UTC m=+3294.069813961" lastFinishedPulling="2025-11-27 12:04:20.194294463 +0000 UTC m=+3301.293792661" observedRunningTime="2025-11-27 12:04:21.318918731 +0000 UTC m=+3302.418416929" watchObservedRunningTime="2025-11-27 12:04:21.322956508 +0000 UTC m=+3302.422454706" Nov 27 12:04:23 crc kubenswrapper[4807]: I1127 12:04:23.920977 4807 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-kg6jg/crc-debug-nq82w"] Nov 27 12:04:23 crc kubenswrapper[4807]: I1127 12:04:23.922648 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kg6jg/crc-debug-nq82w" Nov 27 12:04:23 crc kubenswrapper[4807]: I1127 12:04:23.986842 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ba52619-c201-4db4-ab91-b10f46b6b399-host\") pod \"crc-debug-nq82w\" (UID: \"5ba52619-c201-4db4-ab91-b10f46b6b399\") " pod="openshift-must-gather-kg6jg/crc-debug-nq82w" Nov 27 12:04:23 crc kubenswrapper[4807]: I1127 12:04:23.986997 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp7zf\" (UniqueName: \"kubernetes.io/projected/5ba52619-c201-4db4-ab91-b10f46b6b399-kube-api-access-lp7zf\") pod \"crc-debug-nq82w\" (UID: \"5ba52619-c201-4db4-ab91-b10f46b6b399\") " pod="openshift-must-gather-kg6jg/crc-debug-nq82w" Nov 27 12:04:24 crc kubenswrapper[4807]: I1127 12:04:24.088791 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp7zf\" (UniqueName: \"kubernetes.io/projected/5ba52619-c201-4db4-ab91-b10f46b6b399-kube-api-access-lp7zf\") pod \"crc-debug-nq82w\" (UID: \"5ba52619-c201-4db4-ab91-b10f46b6b399\") " pod="openshift-must-gather-kg6jg/crc-debug-nq82w" Nov 27 12:04:24 crc kubenswrapper[4807]: I1127 12:04:24.088940 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ba52619-c201-4db4-ab91-b10f46b6b399-host\") pod \"crc-debug-nq82w\" (UID: \"5ba52619-c201-4db4-ab91-b10f46b6b399\") " pod="openshift-must-gather-kg6jg/crc-debug-nq82w" Nov 27 12:04:24 crc kubenswrapper[4807]: I1127 12:04:24.089055 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5ba52619-c201-4db4-ab91-b10f46b6b399-host\") pod \"crc-debug-nq82w\" (UID: \"5ba52619-c201-4db4-ab91-b10f46b6b399\") " pod="openshift-must-gather-kg6jg/crc-debug-nq82w" Nov 27 12:04:24 crc kubenswrapper[4807]: I1127 12:04:24.108025 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp7zf\" (UniqueName: \"kubernetes.io/projected/5ba52619-c201-4db4-ab91-b10f46b6b399-kube-api-access-lp7zf\") pod \"crc-debug-nq82w\" (UID: \"5ba52619-c201-4db4-ab91-b10f46b6b399\") " pod="openshift-must-gather-kg6jg/crc-debug-nq82w" Nov 27 12:04:24 crc kubenswrapper[4807]: I1127 12:04:24.241311 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kg6jg/crc-debug-nq82w" Nov 27 12:04:24 crc kubenswrapper[4807]: W1127 12:04:24.280257 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ba52619_c201_4db4_ab91_b10f46b6b399.slice/crio-459c8c232daaef1e09262c3b5bec90b8b62806cefd42b989cc95522e9e0c4235 WatchSource:0}: Error finding container 459c8c232daaef1e09262c3b5bec90b8b62806cefd42b989cc95522e9e0c4235: Status 404 returned error can't find the container with id 459c8c232daaef1e09262c3b5bec90b8b62806cefd42b989cc95522e9e0c4235 Nov 27 12:04:24 crc kubenswrapper[4807]: I1127 12:04:24.331103 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg6jg/crc-debug-nq82w" event={"ID":"5ba52619-c201-4db4-ab91-b10f46b6b399","Type":"ContainerStarted","Data":"459c8c232daaef1e09262c3b5bec90b8b62806cefd42b989cc95522e9e0c4235"} Nov 27 12:04:24 crc kubenswrapper[4807]: I1127 12:04:24.532423 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:04:24 crc kubenswrapper[4807]: E1127 12:04:24.532715 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:04:36 crc kubenswrapper[4807]: I1127 12:04:36.532607 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:04:36 crc kubenswrapper[4807]: E1127 12:04:36.533265 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:04:37 crc kubenswrapper[4807]: I1127 12:04:37.454718 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg6jg/crc-debug-nq82w" event={"ID":"5ba52619-c201-4db4-ab91-b10f46b6b399","Type":"ContainerStarted","Data":"0808a0fecced7d6491681430acd2166138250a4546af183b5c37e041c5feadd9"} Nov 27 12:04:37 crc kubenswrapper[4807]: I1127 12:04:37.471932 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kg6jg/crc-debug-nq82w" podStartSLOduration=2.265149475 podStartE2EDuration="14.471915582s" podCreationTimestamp="2025-11-27 12:04:23 +0000 UTC" firstStartedPulling="2025-11-27 12:04:24.282239469 +0000 UTC m=+3305.381737667" lastFinishedPulling="2025-11-27 12:04:36.489005576 +0000 UTC m=+3317.588503774" observedRunningTime="2025-11-27 12:04:37.471359238 +0000 UTC m=+3318.570857446" watchObservedRunningTime="2025-11-27 12:04:37.471915582 +0000 UTC m=+3318.571413780" Nov 27 12:04:47 crc kubenswrapper[4807]: I1127 12:04:47.533779 4807 
scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:04:47 crc kubenswrapper[4807]: E1127 12:04:47.534544 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:05:01 crc kubenswrapper[4807]: I1127 12:05:01.532572 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:05:01 crc kubenswrapper[4807]: E1127 12:05:01.533331 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:05:13 crc kubenswrapper[4807]: I1127 12:05:13.781355 4807 generic.go:334] "Generic (PLEG): container finished" podID="5ba52619-c201-4db4-ab91-b10f46b6b399" containerID="0808a0fecced7d6491681430acd2166138250a4546af183b5c37e041c5feadd9" exitCode=0 Nov 27 12:05:13 crc kubenswrapper[4807]: I1127 12:05:13.781412 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg6jg/crc-debug-nq82w" event={"ID":"5ba52619-c201-4db4-ab91-b10f46b6b399","Type":"ContainerDied","Data":"0808a0fecced7d6491681430acd2166138250a4546af183b5c37e041c5feadd9"} Nov 27 12:05:14 crc kubenswrapper[4807]: I1127 12:05:14.896877 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg6jg/crc-debug-nq82w" Nov 27 12:05:14 crc kubenswrapper[4807]: I1127 12:05:14.937682 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kg6jg/crc-debug-nq82w"] Nov 27 12:05:14 crc kubenswrapper[4807]: I1127 12:05:14.947236 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kg6jg/crc-debug-nq82w"] Nov 27 12:05:15 crc kubenswrapper[4807]: I1127 12:05:15.069769 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp7zf\" (UniqueName: \"kubernetes.io/projected/5ba52619-c201-4db4-ab91-b10f46b6b399-kube-api-access-lp7zf\") pod \"5ba52619-c201-4db4-ab91-b10f46b6b399\" (UID: \"5ba52619-c201-4db4-ab91-b10f46b6b399\") " Nov 27 12:05:15 crc kubenswrapper[4807]: I1127 12:05:15.069875 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ba52619-c201-4db4-ab91-b10f46b6b399-host\") pod \"5ba52619-c201-4db4-ab91-b10f46b6b399\" (UID: \"5ba52619-c201-4db4-ab91-b10f46b6b399\") " Nov 27 12:05:15 crc kubenswrapper[4807]: I1127 12:05:15.069963 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ba52619-c201-4db4-ab91-b10f46b6b399-host" (OuterVolumeSpecName: "host") pod "5ba52619-c201-4db4-ab91-b10f46b6b399" (UID: "5ba52619-c201-4db4-ab91-b10f46b6b399"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 12:05:15 crc kubenswrapper[4807]: I1127 12:05:15.070788 4807 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ba52619-c201-4db4-ab91-b10f46b6b399-host\") on node \"crc\" DevicePath \"\"" Nov 27 12:05:15 crc kubenswrapper[4807]: I1127 12:05:15.081385 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba52619-c201-4db4-ab91-b10f46b6b399-kube-api-access-lp7zf" (OuterVolumeSpecName: "kube-api-access-lp7zf") pod "5ba52619-c201-4db4-ab91-b10f46b6b399" (UID: "5ba52619-c201-4db4-ab91-b10f46b6b399"). InnerVolumeSpecName "kube-api-access-lp7zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:05:15 crc kubenswrapper[4807]: I1127 12:05:15.172810 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp7zf\" (UniqueName: \"kubernetes.io/projected/5ba52619-c201-4db4-ab91-b10f46b6b399-kube-api-access-lp7zf\") on node \"crc\" DevicePath \"\"" Nov 27 12:05:15 crc kubenswrapper[4807]: I1127 12:05:15.545282 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ba52619-c201-4db4-ab91-b10f46b6b399" path="/var/lib/kubelet/pods/5ba52619-c201-4db4-ab91-b10f46b6b399/volumes" Nov 27 12:05:15 crc kubenswrapper[4807]: I1127 12:05:15.810370 4807 scope.go:117] "RemoveContainer" containerID="0808a0fecced7d6491681430acd2166138250a4546af183b5c37e041c5feadd9" Nov 27 12:05:15 crc kubenswrapper[4807]: I1127 12:05:15.810456 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg6jg/crc-debug-nq82w" Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.160713 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kg6jg/crc-debug-4sfrq"] Nov 27 12:05:16 crc kubenswrapper[4807]: E1127 12:05:16.161953 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ba52619-c201-4db4-ab91-b10f46b6b399" containerName="container-00" Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.161985 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba52619-c201-4db4-ab91-b10f46b6b399" containerName="container-00" Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.162540 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ba52619-c201-4db4-ab91-b10f46b6b399" containerName="container-00" Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.164083 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kg6jg/crc-debug-4sfrq" Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.296457 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qkzj\" (UniqueName: \"kubernetes.io/projected/a70372bb-f77c-44d3-b92e-58a90a0632ab-kube-api-access-4qkzj\") pod \"crc-debug-4sfrq\" (UID: \"a70372bb-f77c-44d3-b92e-58a90a0632ab\") " pod="openshift-must-gather-kg6jg/crc-debug-4sfrq" Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.296555 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a70372bb-f77c-44d3-b92e-58a90a0632ab-host\") pod \"crc-debug-4sfrq\" (UID: \"a70372bb-f77c-44d3-b92e-58a90a0632ab\") " pod="openshift-must-gather-kg6jg/crc-debug-4sfrq" Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.398638 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qkzj\" (UniqueName: 
\"kubernetes.io/projected/a70372bb-f77c-44d3-b92e-58a90a0632ab-kube-api-access-4qkzj\") pod \"crc-debug-4sfrq\" (UID: \"a70372bb-f77c-44d3-b92e-58a90a0632ab\") " pod="openshift-must-gather-kg6jg/crc-debug-4sfrq" Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.398719 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a70372bb-f77c-44d3-b92e-58a90a0632ab-host\") pod \"crc-debug-4sfrq\" (UID: \"a70372bb-f77c-44d3-b92e-58a90a0632ab\") " pod="openshift-must-gather-kg6jg/crc-debug-4sfrq" Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.398871 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a70372bb-f77c-44d3-b92e-58a90a0632ab-host\") pod \"crc-debug-4sfrq\" (UID: \"a70372bb-f77c-44d3-b92e-58a90a0632ab\") " pod="openshift-must-gather-kg6jg/crc-debug-4sfrq" Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.423078 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qkzj\" (UniqueName: \"kubernetes.io/projected/a70372bb-f77c-44d3-b92e-58a90a0632ab-kube-api-access-4qkzj\") pod \"crc-debug-4sfrq\" (UID: \"a70372bb-f77c-44d3-b92e-58a90a0632ab\") " pod="openshift-must-gather-kg6jg/crc-debug-4sfrq" Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.482039 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg6jg/crc-debug-4sfrq" Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.532639 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:05:16 crc kubenswrapper[4807]: E1127 12:05:16.532948 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.820089 4807 generic.go:334] "Generic (PLEG): container finished" podID="a70372bb-f77c-44d3-b92e-58a90a0632ab" containerID="90bea4a4072d7674de18bcf1a1f6bf1afbfcf8ad6e4928a5a9fc69d1192d50af" exitCode=0 Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.820142 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg6jg/crc-debug-4sfrq" event={"ID":"a70372bb-f77c-44d3-b92e-58a90a0632ab","Type":"ContainerDied","Data":"90bea4a4072d7674de18bcf1a1f6bf1afbfcf8ad6e4928a5a9fc69d1192d50af"} Nov 27 12:05:16 crc kubenswrapper[4807]: I1127 12:05:16.820609 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg6jg/crc-debug-4sfrq" event={"ID":"a70372bb-f77c-44d3-b92e-58a90a0632ab","Type":"ContainerStarted","Data":"31f686eb56385b9f749cf9b401e7d9b08f3cbe903238c627161a80e095e8a0e1"} Nov 27 12:05:17 crc kubenswrapper[4807]: I1127 12:05:17.259721 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kg6jg/crc-debug-4sfrq"] Nov 27 12:05:17 crc kubenswrapper[4807]: I1127 12:05:17.268835 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kg6jg/crc-debug-4sfrq"] Nov 27 12:05:17 crc 
kubenswrapper[4807]: I1127 12:05:17.925333 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kg6jg/crc-debug-4sfrq" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.026464 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qkzj\" (UniqueName: \"kubernetes.io/projected/a70372bb-f77c-44d3-b92e-58a90a0632ab-kube-api-access-4qkzj\") pod \"a70372bb-f77c-44d3-b92e-58a90a0632ab\" (UID: \"a70372bb-f77c-44d3-b92e-58a90a0632ab\") " Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.026602 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a70372bb-f77c-44d3-b92e-58a90a0632ab-host\") pod \"a70372bb-f77c-44d3-b92e-58a90a0632ab\" (UID: \"a70372bb-f77c-44d3-b92e-58a90a0632ab\") " Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.027135 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a70372bb-f77c-44d3-b92e-58a90a0632ab-host" (OuterVolumeSpecName: "host") pod "a70372bb-f77c-44d3-b92e-58a90a0632ab" (UID: "a70372bb-f77c-44d3-b92e-58a90a0632ab"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.033935 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a70372bb-f77c-44d3-b92e-58a90a0632ab-kube-api-access-4qkzj" (OuterVolumeSpecName: "kube-api-access-4qkzj") pod "a70372bb-f77c-44d3-b92e-58a90a0632ab" (UID: "a70372bb-f77c-44d3-b92e-58a90a0632ab"). InnerVolumeSpecName "kube-api-access-4qkzj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.128564 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qkzj\" (UniqueName: \"kubernetes.io/projected/a70372bb-f77c-44d3-b92e-58a90a0632ab-kube-api-access-4qkzj\") on node \"crc\" DevicePath \"\"" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.128606 4807 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a70372bb-f77c-44d3-b92e-58a90a0632ab-host\") on node \"crc\" DevicePath \"\"" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.425067 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kg6jg/crc-debug-ljrfq"] Nov 27 12:05:18 crc kubenswrapper[4807]: E1127 12:05:18.425466 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a70372bb-f77c-44d3-b92e-58a90a0632ab" containerName="container-00" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.425479 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70372bb-f77c-44d3-b92e-58a90a0632ab" containerName="container-00" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.425640 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a70372bb-f77c-44d3-b92e-58a90a0632ab" containerName="container-00" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.426181 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg6jg/crc-debug-ljrfq" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.536438 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a83a9be1-7d93-43f2-b71e-c1b0b17ac812-host\") pod \"crc-debug-ljrfq\" (UID: \"a83a9be1-7d93-43f2-b71e-c1b0b17ac812\") " pod="openshift-must-gather-kg6jg/crc-debug-ljrfq" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.536491 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gx9n\" (UniqueName: \"kubernetes.io/projected/a83a9be1-7d93-43f2-b71e-c1b0b17ac812-kube-api-access-4gx9n\") pod \"crc-debug-ljrfq\" (UID: \"a83a9be1-7d93-43f2-b71e-c1b0b17ac812\") " pod="openshift-must-gather-kg6jg/crc-debug-ljrfq" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.638353 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a83a9be1-7d93-43f2-b71e-c1b0b17ac812-host\") pod \"crc-debug-ljrfq\" (UID: \"a83a9be1-7d93-43f2-b71e-c1b0b17ac812\") " pod="openshift-must-gather-kg6jg/crc-debug-ljrfq" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.638417 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gx9n\" (UniqueName: \"kubernetes.io/projected/a83a9be1-7d93-43f2-b71e-c1b0b17ac812-kube-api-access-4gx9n\") pod \"crc-debug-ljrfq\" (UID: \"a83a9be1-7d93-43f2-b71e-c1b0b17ac812\") " pod="openshift-must-gather-kg6jg/crc-debug-ljrfq" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.638486 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a83a9be1-7d93-43f2-b71e-c1b0b17ac812-host\") pod \"crc-debug-ljrfq\" (UID: \"a83a9be1-7d93-43f2-b71e-c1b0b17ac812\") " pod="openshift-must-gather-kg6jg/crc-debug-ljrfq" Nov 27 12:05:18 crc 
kubenswrapper[4807]: I1127 12:05:18.653745 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gx9n\" (UniqueName: \"kubernetes.io/projected/a83a9be1-7d93-43f2-b71e-c1b0b17ac812-kube-api-access-4gx9n\") pod \"crc-debug-ljrfq\" (UID: \"a83a9be1-7d93-43f2-b71e-c1b0b17ac812\") " pod="openshift-must-gather-kg6jg/crc-debug-ljrfq" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.742750 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kg6jg/crc-debug-ljrfq" Nov 27 12:05:18 crc kubenswrapper[4807]: W1127 12:05:18.799264 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda83a9be1_7d93_43f2_b71e_c1b0b17ac812.slice/crio-20eb559f2d8d1dcc3dbb88423e44507c7f5dea6518f13c659926420b37d0079d WatchSource:0}: Error finding container 20eb559f2d8d1dcc3dbb88423e44507c7f5dea6518f13c659926420b37d0079d: Status 404 returned error can't find the container with id 20eb559f2d8d1dcc3dbb88423e44507c7f5dea6518f13c659926420b37d0079d Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.846518 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg6jg/crc-debug-4sfrq" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.846539 4807 scope.go:117] "RemoveContainer" containerID="90bea4a4072d7674de18bcf1a1f6bf1afbfcf8ad6e4928a5a9fc69d1192d50af" Nov 27 12:05:18 crc kubenswrapper[4807]: I1127 12:05:18.848340 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg6jg/crc-debug-ljrfq" event={"ID":"a83a9be1-7d93-43f2-b71e-c1b0b17ac812","Type":"ContainerStarted","Data":"20eb559f2d8d1dcc3dbb88423e44507c7f5dea6518f13c659926420b37d0079d"} Nov 27 12:05:19 crc kubenswrapper[4807]: I1127 12:05:19.546761 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a70372bb-f77c-44d3-b92e-58a90a0632ab" path="/var/lib/kubelet/pods/a70372bb-f77c-44d3-b92e-58a90a0632ab/volumes" Nov 27 12:05:19 crc kubenswrapper[4807]: I1127 12:05:19.858543 4807 generic.go:334] "Generic (PLEG): container finished" podID="a83a9be1-7d93-43f2-b71e-c1b0b17ac812" containerID="85480b13ce2291724da3e0dddd30e76eabad9a293841f320a0c922dc7de52ed6" exitCode=0 Nov 27 12:05:19 crc kubenswrapper[4807]: I1127 12:05:19.858602 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg6jg/crc-debug-ljrfq" event={"ID":"a83a9be1-7d93-43f2-b71e-c1b0b17ac812","Type":"ContainerDied","Data":"85480b13ce2291724da3e0dddd30e76eabad9a293841f320a0c922dc7de52ed6"} Nov 27 12:05:19 crc kubenswrapper[4807]: I1127 12:05:19.901619 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kg6jg/crc-debug-ljrfq"] Nov 27 12:05:19 crc kubenswrapper[4807]: I1127 12:05:19.908647 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kg6jg/crc-debug-ljrfq"] Nov 27 12:05:20 crc kubenswrapper[4807]: I1127 12:05:20.974769 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg6jg/crc-debug-ljrfq" Nov 27 12:05:21 crc kubenswrapper[4807]: I1127 12:05:21.091103 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a83a9be1-7d93-43f2-b71e-c1b0b17ac812-host\") pod \"a83a9be1-7d93-43f2-b71e-c1b0b17ac812\" (UID: \"a83a9be1-7d93-43f2-b71e-c1b0b17ac812\") " Nov 27 12:05:21 crc kubenswrapper[4807]: I1127 12:05:21.091416 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gx9n\" (UniqueName: \"kubernetes.io/projected/a83a9be1-7d93-43f2-b71e-c1b0b17ac812-kube-api-access-4gx9n\") pod \"a83a9be1-7d93-43f2-b71e-c1b0b17ac812\" (UID: \"a83a9be1-7d93-43f2-b71e-c1b0b17ac812\") " Nov 27 12:05:21 crc kubenswrapper[4807]: I1127 12:05:21.092023 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a83a9be1-7d93-43f2-b71e-c1b0b17ac812-host" (OuterVolumeSpecName: "host") pod "a83a9be1-7d93-43f2-b71e-c1b0b17ac812" (UID: "a83a9be1-7d93-43f2-b71e-c1b0b17ac812"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 12:05:21 crc kubenswrapper[4807]: I1127 12:05:21.097542 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a83a9be1-7d93-43f2-b71e-c1b0b17ac812-kube-api-access-4gx9n" (OuterVolumeSpecName: "kube-api-access-4gx9n") pod "a83a9be1-7d93-43f2-b71e-c1b0b17ac812" (UID: "a83a9be1-7d93-43f2-b71e-c1b0b17ac812"). InnerVolumeSpecName "kube-api-access-4gx9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:05:21 crc kubenswrapper[4807]: I1127 12:05:21.193188 4807 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a83a9be1-7d93-43f2-b71e-c1b0b17ac812-host\") on node \"crc\" DevicePath \"\"" Nov 27 12:05:21 crc kubenswrapper[4807]: I1127 12:05:21.193490 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gx9n\" (UniqueName: \"kubernetes.io/projected/a83a9be1-7d93-43f2-b71e-c1b0b17ac812-kube-api-access-4gx9n\") on node \"crc\" DevicePath \"\"" Nov 27 12:05:21 crc kubenswrapper[4807]: I1127 12:05:21.544098 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a83a9be1-7d93-43f2-b71e-c1b0b17ac812" path="/var/lib/kubelet/pods/a83a9be1-7d93-43f2-b71e-c1b0b17ac812/volumes" Nov 27 12:05:21 crc kubenswrapper[4807]: I1127 12:05:21.891845 4807 scope.go:117] "RemoveContainer" containerID="85480b13ce2291724da3e0dddd30e76eabad9a293841f320a0c922dc7de52ed6" Nov 27 12:05:21 crc kubenswrapper[4807]: I1127 12:05:21.891984 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg6jg/crc-debug-ljrfq" Nov 27 12:05:27 crc kubenswrapper[4807]: I1127 12:05:27.532552 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:05:27 crc kubenswrapper[4807]: E1127 12:05:27.533719 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.282047 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2gsnh"] Nov 27 12:05:32 crc kubenswrapper[4807]: E1127 12:05:32.284446 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a83a9be1-7d93-43f2-b71e-c1b0b17ac812" containerName="container-00" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.284592 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="a83a9be1-7d93-43f2-b71e-c1b0b17ac812" containerName="container-00" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.284936 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="a83a9be1-7d93-43f2-b71e-c1b0b17ac812" containerName="container-00" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.286714 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.292674 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2gsnh"] Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.314374 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2396e1-0743-49ed-a665-1aef889ee983-catalog-content\") pod \"redhat-operators-2gsnh\" (UID: \"ba2396e1-0743-49ed-a665-1aef889ee983\") " pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.314427 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h2wb\" (UniqueName: \"kubernetes.io/projected/ba2396e1-0743-49ed-a665-1aef889ee983-kube-api-access-7h2wb\") pod \"redhat-operators-2gsnh\" (UID: \"ba2396e1-0743-49ed-a665-1aef889ee983\") " pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.314698 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2396e1-0743-49ed-a665-1aef889ee983-utilities\") pod \"redhat-operators-2gsnh\" (UID: \"ba2396e1-0743-49ed-a665-1aef889ee983\") " pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.417183 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2396e1-0743-49ed-a665-1aef889ee983-utilities\") pod \"redhat-operators-2gsnh\" (UID: \"ba2396e1-0743-49ed-a665-1aef889ee983\") " pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.417573 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2396e1-0743-49ed-a665-1aef889ee983-catalog-content\") pod \"redhat-operators-2gsnh\" (UID: \"ba2396e1-0743-49ed-a665-1aef889ee983\") " pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.417670 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7h2wb\" (UniqueName: \"kubernetes.io/projected/ba2396e1-0743-49ed-a665-1aef889ee983-kube-api-access-7h2wb\") pod \"redhat-operators-2gsnh\" (UID: \"ba2396e1-0743-49ed-a665-1aef889ee983\") " pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.417943 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2396e1-0743-49ed-a665-1aef889ee983-catalog-content\") pod \"redhat-operators-2gsnh\" (UID: \"ba2396e1-0743-49ed-a665-1aef889ee983\") " pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.417994 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2396e1-0743-49ed-a665-1aef889ee983-utilities\") pod \"redhat-operators-2gsnh\" (UID: \"ba2396e1-0743-49ed-a665-1aef889ee983\") " pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.440600 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h2wb\" (UniqueName: \"kubernetes.io/projected/ba2396e1-0743-49ed-a665-1aef889ee983-kube-api-access-7h2wb\") pod \"redhat-operators-2gsnh\" (UID: \"ba2396e1-0743-49ed-a665-1aef889ee983\") " pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:32 crc kubenswrapper[4807]: I1127 12:05:32.610224 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:33 crc kubenswrapper[4807]: I1127 12:05:33.143753 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2gsnh"] Nov 27 12:05:33 crc kubenswrapper[4807]: I1127 12:05:33.993208 4807 generic.go:334] "Generic (PLEG): container finished" podID="ba2396e1-0743-49ed-a665-1aef889ee983" containerID="8b26e74345ecb173c96226d30ab09cd1c1e5e0d7409a758fea80ec6a6f9b485a" exitCode=0 Nov 27 12:05:33 crc kubenswrapper[4807]: I1127 12:05:33.993283 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gsnh" event={"ID":"ba2396e1-0743-49ed-a665-1aef889ee983","Type":"ContainerDied","Data":"8b26e74345ecb173c96226d30ab09cd1c1e5e0d7409a758fea80ec6a6f9b485a"} Nov 27 12:05:33 crc kubenswrapper[4807]: I1127 12:05:33.994481 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gsnh" event={"ID":"ba2396e1-0743-49ed-a665-1aef889ee983","Type":"ContainerStarted","Data":"55bda5526bc0fd6b0968566f1f8e011f5c55aa5d02166c5c1a85c899a092e626"} Nov 27 12:05:34 crc kubenswrapper[4807]: I1127 12:05:34.834266 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b4c869746-crx9p_b3ebef31-c3b4-4d86-96b1-92bb2038fcc2/barbican-api/0.log" Nov 27 12:05:35 crc kubenswrapper[4807]: I1127 12:05:35.002012 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b4c869746-crx9p_b3ebef31-c3b4-4d86-96b1-92bb2038fcc2/barbican-api-log/0.log" Nov 27 12:05:35 crc kubenswrapper[4807]: I1127 12:05:35.027415 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-67945bfd5d-wnmj5_25b00dcc-1d4a-4d61-9865-db7b0515e360/barbican-keystone-listener/0.log" Nov 27 12:05:35 crc kubenswrapper[4807]: I1127 12:05:35.187558 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-67945bfd5d-wnmj5_25b00dcc-1d4a-4d61-9865-db7b0515e360/barbican-keystone-listener-log/0.log" Nov 27 12:05:35 crc kubenswrapper[4807]: I1127 12:05:35.190971 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5df9b5c779-cqvbn_b22f0add-3876-4db6-a6ac-83bf95c37ea6/barbican-worker/0.log" Nov 27 12:05:35 crc kubenswrapper[4807]: I1127 12:05:35.240051 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5df9b5c779-cqvbn_b22f0add-3876-4db6-a6ac-83bf95c37ea6/barbican-worker-log/0.log" Nov 27 12:05:35 crc kubenswrapper[4807]: I1127 12:05:35.384478 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb_a94d3cc9-c680-4f54-a2b6-0f55690f4cfa/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:35 crc kubenswrapper[4807]: I1127 12:05:35.493576 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2310e932-c289-4fe8-a5f9-ee9ce3ce915b/ceilometer-central-agent/0.log" Nov 27 12:05:35 crc kubenswrapper[4807]: I1127 12:05:35.624641 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2310e932-c289-4fe8-a5f9-ee9ce3ce915b/proxy-httpd/0.log" Nov 27 12:05:35 crc kubenswrapper[4807]: I1127 12:05:35.686969 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2310e932-c289-4fe8-a5f9-ee9ce3ce915b/ceilometer-notification-agent/0.log" Nov 27 12:05:35 crc kubenswrapper[4807]: I1127 12:05:35.760929 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2310e932-c289-4fe8-a5f9-ee9ce3ce915b/sg-core/0.log" Nov 27 12:05:35 crc kubenswrapper[4807]: I1127 12:05:35.881301 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_722777ce-cfa6-4b7d-96ba-452a6998356d/cinder-api/0.log" Nov 27 12:05:35 crc kubenswrapper[4807]: I1127 
12:05:35.920429 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_722777ce-cfa6-4b7d-96ba-452a6998356d/cinder-api-log/0.log" Nov 27 12:05:36 crc kubenswrapper[4807]: I1127 12:05:36.006678 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_eb01b182-7b19-44ea-874b-3ad6a1ebb6a7/cinder-scheduler/0.log" Nov 27 12:05:36 crc kubenswrapper[4807]: I1127 12:05:36.012083 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gsnh" event={"ID":"ba2396e1-0743-49ed-a665-1aef889ee983","Type":"ContainerStarted","Data":"a29be76a46c27921e5ab5a4b5ba2b02b6b3a4d93f5ca0681bbd37ec343911383"} Nov 27 12:05:36 crc kubenswrapper[4807]: I1127 12:05:36.094754 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_eb01b182-7b19-44ea-874b-3ad6a1ebb6a7/probe/0.log" Nov 27 12:05:36 crc kubenswrapper[4807]: I1127 12:05:36.262220 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-64std_193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:36 crc kubenswrapper[4807]: I1127 12:05:36.308837 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw_369043db-4f00-4bbd-ab16-6d8f27564af2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:36 crc kubenswrapper[4807]: I1127 12:05:36.459100 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-956t7_6f7c0ed3-d807-4035-b4c9-a2f906d06c46/init/0.log" Nov 27 12:05:36 crc kubenswrapper[4807]: I1127 12:05:36.585042 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-956t7_6f7c0ed3-d807-4035-b4c9-a2f906d06c46/init/0.log" Nov 27 12:05:36 crc kubenswrapper[4807]: I1127 12:05:36.597782 4807 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-956t7_6f7c0ed3-d807-4035-b4c9-a2f906d06c46/dnsmasq-dns/0.log" Nov 27 12:05:36 crc kubenswrapper[4807]: I1127 12:05:36.668059 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x_7cfe00fa-307e-460b-a77e-a57439954c87/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:36 crc kubenswrapper[4807]: I1127 12:05:36.808558 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0a059d86-8a32-481a-80c7-e9675cb921b9/glance-httpd/0.log" Nov 27 12:05:36 crc kubenswrapper[4807]: I1127 12:05:36.866274 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0a059d86-8a32-481a-80c7-e9675cb921b9/glance-log/0.log" Nov 27 12:05:36 crc kubenswrapper[4807]: I1127 12:05:36.989124 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_04b42996-10c7-401c-b91b-e0ab4e100173/glance-log/0.log" Nov 27 12:05:37 crc kubenswrapper[4807]: I1127 12:05:37.020114 4807 generic.go:334] "Generic (PLEG): container finished" podID="ba2396e1-0743-49ed-a665-1aef889ee983" containerID="a29be76a46c27921e5ab5a4b5ba2b02b6b3a4d93f5ca0681bbd37ec343911383" exitCode=0 Nov 27 12:05:37 crc kubenswrapper[4807]: I1127 12:05:37.020156 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gsnh" event={"ID":"ba2396e1-0743-49ed-a665-1aef889ee983","Type":"ContainerDied","Data":"a29be76a46c27921e5ab5a4b5ba2b02b6b3a4d93f5ca0681bbd37ec343911383"} Nov 27 12:05:37 crc kubenswrapper[4807]: I1127 12:05:37.041691 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_04b42996-10c7-401c-b91b-e0ab4e100173/glance-httpd/0.log" Nov 27 12:05:37 crc kubenswrapper[4807]: I1127 12:05:37.293468 4807 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2_e50793ea-c215-407b-ac8f-a5767166a0dd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:37 crc kubenswrapper[4807]: I1127 12:05:37.309114 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d69cff6fb-88t5t_ba9d500c-ec74-4755-924d-8b6160bb51dc/horizon/0.log" Nov 27 12:05:37 crc kubenswrapper[4807]: I1127 12:05:37.470328 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d69cff6fb-88t5t_ba9d500c-ec74-4755-924d-8b6160bb51dc/horizon-log/0.log" Nov 27 12:05:37 crc kubenswrapper[4807]: I1127 12:05:37.635920 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xc7q6_b0775b68-e606-412d-a9b9-1f8eb98bbd63/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:37 crc kubenswrapper[4807]: I1127 12:05:37.806635 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-54b9d76d5d-4mfvr_bc71ab7b-e861-46eb-ab9e-e45a4aafd76b/keystone-api/0.log" Nov 27 12:05:37 crc kubenswrapper[4807]: I1127 12:05:37.942472 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2744cb30-46c9-4f1e-a771-9bd30eefa50d/kube-state-metrics/0.log" Nov 27 12:05:38 crc kubenswrapper[4807]: I1127 12:05:38.036858 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29404081-8hnvz_d0a23453-ffa9-450e-a401-8f3c4a917196/keystone-cron/0.log" Nov 27 12:05:38 crc kubenswrapper[4807]: I1127 12:05:38.086097 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm_36b0f83c-c6d3-4d4b-9675-478b3f02f952/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:38 crc kubenswrapper[4807]: I1127 12:05:38.434120 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-ff549ff99-zxxvk_311f3fc5-b5ab-4fd9-8146-7442b0b29409/neutron-api/0.log" Nov 27 12:05:38 crc kubenswrapper[4807]: I1127 12:05:38.502439 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-ff549ff99-zxxvk_311f3fc5-b5ab-4fd9-8146-7442b0b29409/neutron-httpd/0.log" Nov 27 12:05:38 crc kubenswrapper[4807]: I1127 12:05:38.619658 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx_2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:39 crc kubenswrapper[4807]: I1127 12:05:39.127222 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a071e484-2dfb-4bef-a538-69770c7f5f56/nova-api-log/0.log" Nov 27 12:05:39 crc kubenswrapper[4807]: I1127 12:05:39.129407 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_621dbc60-ba00-466f-8cbb-2e58611dff37/nova-cell0-conductor-conductor/0.log" Nov 27 12:05:39 crc kubenswrapper[4807]: I1127 12:05:39.278842 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a071e484-2dfb-4bef-a538-69770c7f5f56/nova-api-api/0.log" Nov 27 12:05:39 crc kubenswrapper[4807]: I1127 12:05:39.499051 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_2f104b79-cd6b-4d1b-9ad9-a508e5ec636b/nova-cell1-conductor-conductor/0.log" Nov 27 12:05:39 crc kubenswrapper[4807]: I1127 12:05:39.513709 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_148fe221-9289-4661-9ac6-fa5eb6af9b7f/nova-cell1-novncproxy-novncproxy/0.log" Nov 27 12:05:39 crc kubenswrapper[4807]: I1127 12:05:39.751064 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-xnswz_08c2cd76-cfdb-4de6-ac04-8925b75415fa/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:39 crc kubenswrapper[4807]: I1127 12:05:39.955785 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_088b7a89-396a-434c-b201-a7ecb96cb2e7/nova-metadata-log/0.log" Nov 27 12:05:40 crc kubenswrapper[4807]: I1127 12:05:40.048224 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gsnh" event={"ID":"ba2396e1-0743-49ed-a665-1aef889ee983","Type":"ContainerStarted","Data":"d3459b4e4d3a23c37d025ce9ba54d1848811d03c9daf8c182cf6b51bb2cdccce"} Nov 27 12:05:40 crc kubenswrapper[4807]: I1127 12:05:40.074525 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2gsnh" podStartSLOduration=3.074598779 podStartE2EDuration="8.074501333s" podCreationTimestamp="2025-11-27 12:05:32 +0000 UTC" firstStartedPulling="2025-11-27 12:05:33.99505913 +0000 UTC m=+3375.094557328" lastFinishedPulling="2025-11-27 12:05:38.994961684 +0000 UTC m=+3380.094459882" observedRunningTime="2025-11-27 12:05:40.063556611 +0000 UTC m=+3381.163054819" watchObservedRunningTime="2025-11-27 12:05:40.074501333 +0000 UTC m=+3381.173999541" Nov 27 12:05:40 crc kubenswrapper[4807]: I1127 12:05:40.173770 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_39c3409e-49b1-4dfd-ba16-005e9f6e5a44/nova-scheduler-scheduler/0.log" Nov 27 12:05:40 crc kubenswrapper[4807]: I1127 12:05:40.328374 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6603c2ee-9ab6-476c-8db6-d073f0dec3aa/mysql-bootstrap/0.log" Nov 27 12:05:40 crc kubenswrapper[4807]: I1127 12:05:40.531913 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:05:40 crc kubenswrapper[4807]: E1127 
12:05:40.532143 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:05:40 crc kubenswrapper[4807]: I1127 12:05:40.560693 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6603c2ee-9ab6-476c-8db6-d073f0dec3aa/mysql-bootstrap/0.log" Nov 27 12:05:40 crc kubenswrapper[4807]: I1127 12:05:40.593330 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6603c2ee-9ab6-476c-8db6-d073f0dec3aa/galera/0.log" Nov 27 12:05:40 crc kubenswrapper[4807]: I1127 12:05:40.863171 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_122d837d-ee30-4e26-9e01-1f4bd8ebaace/mysql-bootstrap/0.log" Nov 27 12:05:40 crc kubenswrapper[4807]: I1127 12:05:40.942009 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_088b7a89-396a-434c-b201-a7ecb96cb2e7/nova-metadata-metadata/0.log" Nov 27 12:05:40 crc kubenswrapper[4807]: I1127 12:05:40.993502 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_122d837d-ee30-4e26-9e01-1f4bd8ebaace/galera/0.log" Nov 27 12:05:41 crc kubenswrapper[4807]: I1127 12:05:41.062791 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_122d837d-ee30-4e26-9e01-1f4bd8ebaace/mysql-bootstrap/0.log" Nov 27 12:05:41 crc kubenswrapper[4807]: I1127 12:05:41.152841 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_99f92409-b35d-4905-bdac-488235b8c054/openstackclient/0.log" Nov 27 12:05:41 crc kubenswrapper[4807]: I1127 
12:05:41.326732 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-64nw4_356f01bb-6304-499b-946d-1e9f3d6e7572/ovn-controller/0.log" Nov 27 12:05:41 crc kubenswrapper[4807]: I1127 12:05:41.385512 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6pfcf_186b6a8f-d303-440b-99ea-6502bac3e583/openstack-network-exporter/0.log" Nov 27 12:05:41 crc kubenswrapper[4807]: I1127 12:05:41.525503 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-26rzj_9bac140b-1bac-4d27-bb66-111e66af1edf/ovsdb-server-init/0.log" Nov 27 12:05:41 crc kubenswrapper[4807]: I1127 12:05:41.717554 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-26rzj_9bac140b-1bac-4d27-bb66-111e66af1edf/ovsdb-server-init/0.log" Nov 27 12:05:41 crc kubenswrapper[4807]: I1127 12:05:41.789916 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-26rzj_9bac140b-1bac-4d27-bb66-111e66af1edf/ovsdb-server/0.log" Nov 27 12:05:41 crc kubenswrapper[4807]: I1127 12:05:41.802843 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-26rzj_9bac140b-1bac-4d27-bb66-111e66af1edf/ovs-vswitchd/0.log" Nov 27 12:05:41 crc kubenswrapper[4807]: I1127 12:05:41.944642 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-b2bfj_ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:42 crc kubenswrapper[4807]: I1127 12:05:42.013691 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f2270de6-a69c-44be-8fb7-98e10027cd34/openstack-network-exporter/0.log" Nov 27 12:05:42 crc kubenswrapper[4807]: I1127 12:05:42.093410 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f2270de6-a69c-44be-8fb7-98e10027cd34/ovn-northd/0.log" Nov 27 
12:05:42 crc kubenswrapper[4807]: I1127 12:05:42.312921 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0a8b97df-a50b-4cce-8035-28b23cbdaf72/ovsdbserver-nb/0.log" Nov 27 12:05:42 crc kubenswrapper[4807]: I1127 12:05:42.329706 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0a8b97df-a50b-4cce-8035-28b23cbdaf72/openstack-network-exporter/0.log" Nov 27 12:05:42 crc kubenswrapper[4807]: I1127 12:05:42.431925 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e47ac50c-3a93-46fd-94f2-5c83e02e1919/openstack-network-exporter/0.log" Nov 27 12:05:42 crc kubenswrapper[4807]: I1127 12:05:42.501819 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e47ac50c-3a93-46fd-94f2-5c83e02e1919/ovsdbserver-sb/0.log" Nov 27 12:05:42 crc kubenswrapper[4807]: I1127 12:05:42.610422 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:42 crc kubenswrapper[4807]: I1127 12:05:42.610484 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:42 crc kubenswrapper[4807]: I1127 12:05:42.686402 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5449cc7d8-rpm6t_e4fc9fe4-54f1-458b-b2f7-ff20982e3243/placement-api/0.log" Nov 27 12:05:42 crc kubenswrapper[4807]: I1127 12:05:42.833354 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5449cc7d8-rpm6t_e4fc9fe4-54f1-458b-b2f7-ff20982e3243/placement-log/0.log" Nov 27 12:05:42 crc kubenswrapper[4807]: I1127 12:05:42.900859 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c679115a-3605-4e24-8d75-553d53d87f48/setup-container/0.log" Nov 27 12:05:43 crc kubenswrapper[4807]: I1127 12:05:43.096527 4807 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c679115a-3605-4e24-8d75-553d53d87f48/setup-container/0.log" Nov 27 12:05:43 crc kubenswrapper[4807]: I1127 12:05:43.117078 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c679115a-3605-4e24-8d75-553d53d87f48/rabbitmq/0.log" Nov 27 12:05:43 crc kubenswrapper[4807]: I1127 12:05:43.188574 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2dc733a-0951-4580-a301-d0dd7d7937f1/setup-container/0.log" Nov 27 12:05:43 crc kubenswrapper[4807]: I1127 12:05:43.429970 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2dc733a-0951-4580-a301-d0dd7d7937f1/rabbitmq/0.log" Nov 27 12:05:43 crc kubenswrapper[4807]: I1127 12:05:43.446387 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2dc733a-0951-4580-a301-d0dd7d7937f1/setup-container/0.log" Nov 27 12:05:43 crc kubenswrapper[4807]: I1127 12:05:43.463784 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb_cff69888-3585-4127-a2e6-122a7fdfe894/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:43 crc kubenswrapper[4807]: I1127 12:05:43.687284 4807 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2gsnh" podUID="ba2396e1-0743-49ed-a665-1aef889ee983" containerName="registry-server" probeResult="failure" output=< Nov 27 12:05:43 crc kubenswrapper[4807]: timeout: failed to connect service ":50051" within 1s Nov 27 12:05:43 crc kubenswrapper[4807]: > Nov 27 12:05:43 crc kubenswrapper[4807]: I1127 12:05:43.734851 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-xwp2h_edca8731-6d7e-44e5-b2a3-8622578409df/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:43 crc kubenswrapper[4807]: I1127 
12:05:43.882124 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh_c530987e-af49-45dc-ae6e-13c19df75606/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:43 crc kubenswrapper[4807]: I1127 12:05:43.891863 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bdw67_345077b8-ac19-43cb-8eee-e6112034320c/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:44 crc kubenswrapper[4807]: I1127 12:05:44.164670 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-x8578_f628332e-750f-45bd-994e-fcd01490e1e5/ssh-known-hosts-edpm-deployment/0.log" Nov 27 12:05:44 crc kubenswrapper[4807]: I1127 12:05:44.382595 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b6fc97755-xnlzr_ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257/proxy-httpd/0.log" Nov 27 12:05:44 crc kubenswrapper[4807]: I1127 12:05:44.571877 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b6fc97755-xnlzr_ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257/proxy-server/0.log" Nov 27 12:05:44 crc kubenswrapper[4807]: I1127 12:05:44.627805 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-6hg29_b83dff2f-801e-4a9b-9427-48e1f51bcc79/swift-ring-rebalance/0.log" Nov 27 12:05:44 crc kubenswrapper[4807]: I1127 12:05:44.750909 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/account-auditor/0.log" Nov 27 12:05:44 crc kubenswrapper[4807]: I1127 12:05:44.824970 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/account-reaper/0.log" Nov 27 12:05:44 crc kubenswrapper[4807]: I1127 12:05:44.864579 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/account-replicator/0.log" Nov 27 12:05:44 crc kubenswrapper[4807]: I1127 12:05:44.947670 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/account-server/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.010110 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/container-auditor/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.031400 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/container-replicator/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.078564 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/container-server/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.164545 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/container-updater/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.216824 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/object-auditor/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.314080 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/object-expirer/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.353007 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/object-replicator/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.361622 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/object-server/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.505818 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/object-updater/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.508572 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/rsync/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.612415 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/swift-recon-cron/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.750710 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh_8a22c7d6-438a-499e-80d0-384ea7d2ec15/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.800498 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_203f3a06-5cde-4778-837a-90fbfde39772/tempest-tests-tempest-tests-runner/0.log" Nov 27 12:05:45 crc kubenswrapper[4807]: I1127 12:05:45.948735 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e67a9ba6-daee-4e19-bf83-d51152329c5c/test-operator-logs-container/0.log" Nov 27 12:05:46 crc kubenswrapper[4807]: I1127 12:05:46.024865 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-lt95g_e971f91a-7313-4149-af78-554da58f81e1/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:05:52 crc kubenswrapper[4807]: I1127 12:05:52.673627 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2gsnh" Nov 
27 12:05:52 crc kubenswrapper[4807]: I1127 12:05:52.734012 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:52 crc kubenswrapper[4807]: I1127 12:05:52.910368 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2gsnh"] Nov 27 12:05:53 crc kubenswrapper[4807]: I1127 12:05:53.394595 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_8ecce491-4a06-4922-8353-0586ac99471b/memcached/0.log" Nov 27 12:05:54 crc kubenswrapper[4807]: I1127 12:05:54.176765 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2gsnh" podUID="ba2396e1-0743-49ed-a665-1aef889ee983" containerName="registry-server" containerID="cri-o://d3459b4e4d3a23c37d025ce9ba54d1848811d03c9daf8c182cf6b51bb2cdccce" gracePeriod=2 Nov 27 12:05:54 crc kubenswrapper[4807]: I1127 12:05:54.532400 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:05:54 crc kubenswrapper[4807]: E1127 12:05:54.532909 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:05:54 crc kubenswrapper[4807]: I1127 12:05:54.756720 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:54 crc kubenswrapper[4807]: I1127 12:05:54.929241 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2396e1-0743-49ed-a665-1aef889ee983-utilities\") pod \"ba2396e1-0743-49ed-a665-1aef889ee983\" (UID: \"ba2396e1-0743-49ed-a665-1aef889ee983\") " Nov 27 12:05:54 crc kubenswrapper[4807]: I1127 12:05:54.929341 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2396e1-0743-49ed-a665-1aef889ee983-catalog-content\") pod \"ba2396e1-0743-49ed-a665-1aef889ee983\" (UID: \"ba2396e1-0743-49ed-a665-1aef889ee983\") " Nov 27 12:05:54 crc kubenswrapper[4807]: I1127 12:05:54.929513 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7h2wb\" (UniqueName: \"kubernetes.io/projected/ba2396e1-0743-49ed-a665-1aef889ee983-kube-api-access-7h2wb\") pod \"ba2396e1-0743-49ed-a665-1aef889ee983\" (UID: \"ba2396e1-0743-49ed-a665-1aef889ee983\") " Nov 27 12:05:54 crc kubenswrapper[4807]: I1127 12:05:54.930022 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2396e1-0743-49ed-a665-1aef889ee983-utilities" (OuterVolumeSpecName: "utilities") pod "ba2396e1-0743-49ed-a665-1aef889ee983" (UID: "ba2396e1-0743-49ed-a665-1aef889ee983"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:05:54 crc kubenswrapper[4807]: I1127 12:05:54.935505 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2396e1-0743-49ed-a665-1aef889ee983-kube-api-access-7h2wb" (OuterVolumeSpecName: "kube-api-access-7h2wb") pod "ba2396e1-0743-49ed-a665-1aef889ee983" (UID: "ba2396e1-0743-49ed-a665-1aef889ee983"). InnerVolumeSpecName "kube-api-access-7h2wb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.025043 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba2396e1-0743-49ed-a665-1aef889ee983-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba2396e1-0743-49ed-a665-1aef889ee983" (UID: "ba2396e1-0743-49ed-a665-1aef889ee983"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.032099 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7h2wb\" (UniqueName: \"kubernetes.io/projected/ba2396e1-0743-49ed-a665-1aef889ee983-kube-api-access-7h2wb\") on node \"crc\" DevicePath \"\"" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.032134 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba2396e1-0743-49ed-a665-1aef889ee983-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.032145 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba2396e1-0743-49ed-a665-1aef889ee983-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.187061 4807 generic.go:334] "Generic (PLEG): container finished" podID="ba2396e1-0743-49ed-a665-1aef889ee983" containerID="d3459b4e4d3a23c37d025ce9ba54d1848811d03c9daf8c182cf6b51bb2cdccce" exitCode=0 Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.187102 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2gsnh" event={"ID":"ba2396e1-0743-49ed-a665-1aef889ee983","Type":"ContainerDied","Data":"d3459b4e4d3a23c37d025ce9ba54d1848811d03c9daf8c182cf6b51bb2cdccce"} Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.187139 4807 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-2gsnh" event={"ID":"ba2396e1-0743-49ed-a665-1aef889ee983","Type":"ContainerDied","Data":"55bda5526bc0fd6b0968566f1f8e011f5c55aa5d02166c5c1a85c899a092e626"} Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.187156 4807 scope.go:117] "RemoveContainer" containerID="d3459b4e4d3a23c37d025ce9ba54d1848811d03c9daf8c182cf6b51bb2cdccce" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.187292 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2gsnh" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.222679 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2gsnh"] Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.223820 4807 scope.go:117] "RemoveContainer" containerID="a29be76a46c27921e5ab5a4b5ba2b02b6b3a4d93f5ca0681bbd37ec343911383" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.230572 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2gsnh"] Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.250619 4807 scope.go:117] "RemoveContainer" containerID="8b26e74345ecb173c96226d30ab09cd1c1e5e0d7409a758fea80ec6a6f9b485a" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.294764 4807 scope.go:117] "RemoveContainer" containerID="d3459b4e4d3a23c37d025ce9ba54d1848811d03c9daf8c182cf6b51bb2cdccce" Nov 27 12:05:55 crc kubenswrapper[4807]: E1127 12:05:55.295105 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3459b4e4d3a23c37d025ce9ba54d1848811d03c9daf8c182cf6b51bb2cdccce\": container with ID starting with d3459b4e4d3a23c37d025ce9ba54d1848811d03c9daf8c182cf6b51bb2cdccce not found: ID does not exist" containerID="d3459b4e4d3a23c37d025ce9ba54d1848811d03c9daf8c182cf6b51bb2cdccce" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.295146 4807 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3459b4e4d3a23c37d025ce9ba54d1848811d03c9daf8c182cf6b51bb2cdccce"} err="failed to get container status \"d3459b4e4d3a23c37d025ce9ba54d1848811d03c9daf8c182cf6b51bb2cdccce\": rpc error: code = NotFound desc = could not find container \"d3459b4e4d3a23c37d025ce9ba54d1848811d03c9daf8c182cf6b51bb2cdccce\": container with ID starting with d3459b4e4d3a23c37d025ce9ba54d1848811d03c9daf8c182cf6b51bb2cdccce not found: ID does not exist" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.295172 4807 scope.go:117] "RemoveContainer" containerID="a29be76a46c27921e5ab5a4b5ba2b02b6b3a4d93f5ca0681bbd37ec343911383" Nov 27 12:05:55 crc kubenswrapper[4807]: E1127 12:05:55.295395 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a29be76a46c27921e5ab5a4b5ba2b02b6b3a4d93f5ca0681bbd37ec343911383\": container with ID starting with a29be76a46c27921e5ab5a4b5ba2b02b6b3a4d93f5ca0681bbd37ec343911383 not found: ID does not exist" containerID="a29be76a46c27921e5ab5a4b5ba2b02b6b3a4d93f5ca0681bbd37ec343911383" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.295418 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29be76a46c27921e5ab5a4b5ba2b02b6b3a4d93f5ca0681bbd37ec343911383"} err="failed to get container status \"a29be76a46c27921e5ab5a4b5ba2b02b6b3a4d93f5ca0681bbd37ec343911383\": rpc error: code = NotFound desc = could not find container \"a29be76a46c27921e5ab5a4b5ba2b02b6b3a4d93f5ca0681bbd37ec343911383\": container with ID starting with a29be76a46c27921e5ab5a4b5ba2b02b6b3a4d93f5ca0681bbd37ec343911383 not found: ID does not exist" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.295436 4807 scope.go:117] "RemoveContainer" containerID="8b26e74345ecb173c96226d30ab09cd1c1e5e0d7409a758fea80ec6a6f9b485a" Nov 27 12:05:55 crc kubenswrapper[4807]: E1127 
12:05:55.295749 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b26e74345ecb173c96226d30ab09cd1c1e5e0d7409a758fea80ec6a6f9b485a\": container with ID starting with 8b26e74345ecb173c96226d30ab09cd1c1e5e0d7409a758fea80ec6a6f9b485a not found: ID does not exist" containerID="8b26e74345ecb173c96226d30ab09cd1c1e5e0d7409a758fea80ec6a6f9b485a" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.295775 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b26e74345ecb173c96226d30ab09cd1c1e5e0d7409a758fea80ec6a6f9b485a"} err="failed to get container status \"8b26e74345ecb173c96226d30ab09cd1c1e5e0d7409a758fea80ec6a6f9b485a\": rpc error: code = NotFound desc = could not find container \"8b26e74345ecb173c96226d30ab09cd1c1e5e0d7409a758fea80ec6a6f9b485a\": container with ID starting with 8b26e74345ecb173c96226d30ab09cd1c1e5e0d7409a758fea80ec6a6f9b485a not found: ID does not exist" Nov 27 12:05:55 crc kubenswrapper[4807]: I1127 12:05:55.542934 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba2396e1-0743-49ed-a665-1aef889ee983" path="/var/lib/kubelet/pods/ba2396e1-0743-49ed-a665-1aef889ee983/volumes" Nov 27 12:06:09 crc kubenswrapper[4807]: I1127 12:06:09.538587 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:06:09 crc kubenswrapper[4807]: E1127 12:06:09.539398 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:06:09 crc kubenswrapper[4807]: I1127 12:06:09.797795 
4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-z6w5s_7377040f-fbf5-4395-a903-99dbb10dbcac/kube-rbac-proxy/0.log" Nov 27 12:06:10 crc kubenswrapper[4807]: I1127 12:06:10.050588 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-z6w5s_7377040f-fbf5-4395-a903-99dbb10dbcac/manager/0.log" Nov 27 12:06:10 crc kubenswrapper[4807]: I1127 12:06:10.188939 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-6sqtw_623644bf-2d87-4689-acea-cfaeca90285f/kube-rbac-proxy/0.log" Nov 27 12:06:10 crc kubenswrapper[4807]: I1127 12:06:10.236113 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-6sqtw_623644bf-2d87-4689-acea-cfaeca90285f/manager/0.log" Nov 27 12:06:10 crc kubenswrapper[4807]: I1127 12:06:10.385421 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/util/0.log" Nov 27 12:06:10 crc kubenswrapper[4807]: I1127 12:06:10.548078 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/pull/0.log" Nov 27 12:06:10 crc kubenswrapper[4807]: I1127 12:06:10.605730 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/util/0.log" Nov 27 12:06:10 crc kubenswrapper[4807]: I1127 12:06:10.630048 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/pull/0.log" Nov 27 12:06:10 crc kubenswrapper[4807]: I1127 12:06:10.753186 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/util/0.log" Nov 27 12:06:10 crc kubenswrapper[4807]: I1127 12:06:10.771508 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/extract/0.log" Nov 27 12:06:10 crc kubenswrapper[4807]: I1127 12:06:10.798290 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/pull/0.log" Nov 27 12:06:10 crc kubenswrapper[4807]: I1127 12:06:10.943919 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-5wfgl_574b2edd-5058-4d84-a8b8-72258c3c9f7b/kube-rbac-proxy/0.log" Nov 27 12:06:10 crc kubenswrapper[4807]: I1127 12:06:10.973465 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-5wfgl_574b2edd-5058-4d84-a8b8-72258c3c9f7b/manager/0.log" Nov 27 12:06:11 crc kubenswrapper[4807]: I1127 12:06:11.035480 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-v9d6j_5ae030e1-b973-4137-abd1-1abc5f5d1153/kube-rbac-proxy/0.log" Nov 27 12:06:11 crc kubenswrapper[4807]: I1127 12:06:11.209672 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-v9d6j_5ae030e1-b973-4137-abd1-1abc5f5d1153/manager/0.log" Nov 27 12:06:11 crc 
kubenswrapper[4807]: I1127 12:06:11.216980 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-j2tq6_c5b9cfda-ea17-4add-a121-036a989efeab/kube-rbac-proxy/0.log" Nov 27 12:06:11 crc kubenswrapper[4807]: I1127 12:06:11.244054 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-j2tq6_c5b9cfda-ea17-4add-a121-036a989efeab/manager/0.log" Nov 27 12:06:11 crc kubenswrapper[4807]: I1127 12:06:11.391824 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-fztl6_4ae17b3e-8de9-45e3-8404-2f2fda6c6b99/kube-rbac-proxy/0.log" Nov 27 12:06:11 crc kubenswrapper[4807]: I1127 12:06:11.428612 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-fztl6_4ae17b3e-8de9-45e3-8404-2f2fda6c6b99/manager/0.log" Nov 27 12:06:11 crc kubenswrapper[4807]: I1127 12:06:11.581766 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-qg4bq_4be44e13-06b8-494e-8a62-7e8d8747692f/kube-rbac-proxy/0.log" Nov 27 12:06:11 crc kubenswrapper[4807]: I1127 12:06:11.642959 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-xtdbs_9262ad56-c1b8-41ee-ab6b-1b3c57dbdb5b/kube-rbac-proxy/0.log" Nov 27 12:06:11 crc kubenswrapper[4807]: I1127 12:06:11.754517 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-qg4bq_4be44e13-06b8-494e-8a62-7e8d8747692f/manager/0.log" Nov 27 12:06:11 crc kubenswrapper[4807]: I1127 12:06:11.821587 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-xtdbs_9262ad56-c1b8-41ee-ab6b-1b3c57dbdb5b/manager/0.log" Nov 27 12:06:11 crc kubenswrapper[4807]: I1127 12:06:11.842846 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-lz6lg_af2f67ab-040b-4ec1-bf21-db83dcaeb6d2/kube-rbac-proxy/0.log" Nov 27 12:06:11 crc kubenswrapper[4807]: I1127 12:06:11.965139 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-hcnzc_3ae6d3a5-8999-4c3d-a3de-b497ae0776f2/kube-rbac-proxy/0.log" Nov 27 12:06:12 crc kubenswrapper[4807]: I1127 12:06:12.034362 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-lz6lg_af2f67ab-040b-4ec1-bf21-db83dcaeb6d2/manager/0.log" Nov 27 12:06:12 crc kubenswrapper[4807]: I1127 12:06:12.042605 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-hcnzc_3ae6d3a5-8999-4c3d-a3de-b497ae0776f2/manager/0.log" Nov 27 12:06:12 crc kubenswrapper[4807]: I1127 12:06:12.147570 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-5wb7l_fe4ff55b-a2dd-4936-9016-d73ade2388a0/kube-rbac-proxy/0.log" Nov 27 12:06:12 crc kubenswrapper[4807]: I1127 12:06:12.218409 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-5wb7l_fe4ff55b-a2dd-4936-9016-d73ade2388a0/manager/0.log" Nov 27 12:06:12 crc kubenswrapper[4807]: I1127 12:06:12.433306 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-xfrls_961001c9-3719-4306-8d38-b3c5d8e202bc/manager/0.log" Nov 27 12:06:12 crc kubenswrapper[4807]: I1127 12:06:12.436374 4807 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-xfrls_961001c9-3719-4306-8d38-b3c5d8e202bc/kube-rbac-proxy/0.log" Nov 27 12:06:12 crc kubenswrapper[4807]: I1127 12:06:12.497976 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-6l4tm_dcfd531a-2394-41c7-b05a-5b8e95f8459c/kube-rbac-proxy/0.log" Nov 27 12:06:12 crc kubenswrapper[4807]: I1127 12:06:12.689546 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-k82xf_1b313bc2-c896-486c-a520-9843ec7bd6ad/kube-rbac-proxy/0.log" Nov 27 12:06:12 crc kubenswrapper[4807]: I1127 12:06:12.694746 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-6l4tm_dcfd531a-2394-41c7-b05a-5b8e95f8459c/manager/0.log" Nov 27 12:06:12 crc kubenswrapper[4807]: I1127 12:06:12.712544 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-k82xf_1b313bc2-c896-486c-a520-9843ec7bd6ad/manager/0.log" Nov 27 12:06:12 crc kubenswrapper[4807]: I1127 12:06:12.885374 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2_0bde1253-53c0-4864-b22e-dcf25751388e/kube-rbac-proxy/0.log" Nov 27 12:06:12 crc kubenswrapper[4807]: I1127 12:06:12.899486 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2_0bde1253-53c0-4864-b22e-dcf25751388e/manager/0.log" Nov 27 12:06:13 crc kubenswrapper[4807]: I1127 12:06:13.359258 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-59f78dbdf9-fzjdb_ffe745e4-da98-4391-990b-a86d2fbc3346/operator/0.log" 
Nov 27 12:06:13 crc kubenswrapper[4807]: I1127 12:06:13.405737 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-z2sjb_c9606905-c5fd-4ebe-9942-b013364d7ca8/registry-server/0.log" Nov 27 12:06:13 crc kubenswrapper[4807]: I1127 12:06:13.542927 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-d9mj8_787c342b-413f-495a-8b31-bd8a01f35c3a/kube-rbac-proxy/0.log" Nov 27 12:06:13 crc kubenswrapper[4807]: I1127 12:06:13.703191 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-d9mj8_787c342b-413f-495a-8b31-bd8a01f35c3a/manager/0.log" Nov 27 12:06:13 crc kubenswrapper[4807]: I1127 12:06:13.819019 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-cw9bt_a0e8d2a3-0f58-4a1d-9867-648001196d2e/kube-rbac-proxy/0.log" Nov 27 12:06:13 crc kubenswrapper[4807]: I1127 12:06:13.853961 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-cw9bt_a0e8d2a3-0f58-4a1d-9867-648001196d2e/manager/0.log" Nov 27 12:06:14 crc kubenswrapper[4807]: I1127 12:06:14.085050 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-l72vc_8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91/operator/0.log" Nov 27 12:06:14 crc kubenswrapper[4807]: I1127 12:06:14.117363 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-pd7pj_94062b2f-3f5a-404d-9b0a-8b7f858e1322/kube-rbac-proxy/0.log" Nov 27 12:06:14 crc kubenswrapper[4807]: I1127 12:06:14.153034 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6456fcdb48-tjnrt_649aedb9-ad77-47fa-a7e9-89cb12c65928/manager/0.log" Nov 27 12:06:14 crc kubenswrapper[4807]: I1127 12:06:14.169189 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-pd7pj_94062b2f-3f5a-404d-9b0a-8b7f858e1322/manager/0.log" Nov 27 12:06:14 crc kubenswrapper[4807]: I1127 12:06:14.307094 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-svq2j_5b554316-8e33-4fa8-a340-91d9e0f6b0de/kube-rbac-proxy/0.log" Nov 27 12:06:14 crc kubenswrapper[4807]: I1127 12:06:14.362377 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-mw4mw_7fbca001-90e9-4da2-bd14-6bc00a48ed40/kube-rbac-proxy/0.log" Nov 27 12:06:14 crc kubenswrapper[4807]: I1127 12:06:14.395274 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-svq2j_5b554316-8e33-4fa8-a340-91d9e0f6b0de/manager/0.log" Nov 27 12:06:14 crc kubenswrapper[4807]: I1127 12:06:14.476522 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-mw4mw_7fbca001-90e9-4da2-bd14-6bc00a48ed40/manager/0.log" Nov 27 12:06:14 crc kubenswrapper[4807]: I1127 12:06:14.579751 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-tj4db_13749acc-f727-4c3a-b24a-b56bd6b7533d/kube-rbac-proxy/0.log" Nov 27 12:06:14 crc kubenswrapper[4807]: I1127 12:06:14.585448 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-tj4db_13749acc-f727-4c3a-b24a-b56bd6b7533d/manager/0.log" Nov 27 12:06:23 crc kubenswrapper[4807]: I1127 12:06:23.532135 4807 
scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:06:24 crc kubenswrapper[4807]: I1127 12:06:24.431299 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"e6ad55be444bb836094e989710fe293e7e4b3c2a590dc0117eb67fcd7f9ef509"} Nov 27 12:06:33 crc kubenswrapper[4807]: I1127 12:06:33.462285 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xrg5s_8a96dd38-5283-4cea-a3d4-623c6a5191a6/control-plane-machine-set-operator/0.log" Nov 27 12:06:33 crc kubenswrapper[4807]: I1127 12:06:33.606267 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pvv9r_d0fee666-2d95-4330-a8aa-4ab1ca30bb5f/kube-rbac-proxy/0.log" Nov 27 12:06:33 crc kubenswrapper[4807]: I1127 12:06:33.646173 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pvv9r_d0fee666-2d95-4330-a8aa-4ab1ca30bb5f/machine-api-operator/0.log" Nov 27 12:06:45 crc kubenswrapper[4807]: I1127 12:06:45.972693 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-czgf5_bc3f48f0-2d12-4c07-bfb4-20914aeaf910/cert-manager-controller/0.log" Nov 27 12:06:46 crc kubenswrapper[4807]: I1127 12:06:46.085348 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-cbsmh_8285736f-5d32-4503-94dd-f3e7c5d6a8f0/cert-manager-cainjector/0.log" Nov 27 12:06:46 crc kubenswrapper[4807]: I1127 12:06:46.138844 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-nwhxv_622a1ad1-bedf-4836-aa6e-0257f4694ae9/cert-manager-webhook/0.log" Nov 27 12:06:57 crc kubenswrapper[4807]: 
I1127 12:06:57.992024 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-rvf44_902faa80-4625-495f-a0a9-b94bc50eae67/nmstate-console-plugin/0.log" Nov 27 12:06:58 crc kubenswrapper[4807]: I1127 12:06:58.212690 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-sm6qf_8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60/nmstate-handler/0.log" Nov 27 12:06:58 crc kubenswrapper[4807]: I1127 12:06:58.236505 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-rcrm6_18345e30-1bdc-47bf-8e02-16cf6c3f1bb1/kube-rbac-proxy/0.log" Nov 27 12:06:58 crc kubenswrapper[4807]: I1127 12:06:58.240080 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-rcrm6_18345e30-1bdc-47bf-8e02-16cf6c3f1bb1/nmstate-metrics/0.log" Nov 27 12:06:58 crc kubenswrapper[4807]: I1127 12:06:58.461983 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-6cwxp_f99de4c6-acb8-40aa-8c9f-c450de947993/nmstate-operator/0.log" Nov 27 12:06:58 crc kubenswrapper[4807]: I1127 12:06:58.504715 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-w7q6m_57fe27be-6097-4ef2-ac4a-2ff1625005a9/nmstate-webhook/0.log" Nov 27 12:07:12 crc kubenswrapper[4807]: I1127 12:07:12.410869 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-pkvz6_cfa209ab-1103-43a9-88e3-b7dd7048b2a6/kube-rbac-proxy/0.log" Nov 27 12:07:12 crc kubenswrapper[4807]: I1127 12:07:12.511098 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-pkvz6_cfa209ab-1103-43a9-88e3-b7dd7048b2a6/controller/0.log" Nov 27 12:07:12 crc kubenswrapper[4807]: I1127 12:07:12.687786 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-frr-files/0.log" Nov 27 12:07:12 crc kubenswrapper[4807]: I1127 12:07:12.823694 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-reloader/0.log" Nov 27 12:07:12 crc kubenswrapper[4807]: I1127 12:07:12.881946 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-frr-files/0.log" Nov 27 12:07:12 crc kubenswrapper[4807]: I1127 12:07:12.882007 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-metrics/0.log" Nov 27 12:07:12 crc kubenswrapper[4807]: I1127 12:07:12.934420 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-reloader/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.031692 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-reloader/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.043592 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-metrics/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.046577 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-frr-files/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.128796 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-metrics/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.266425 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-frr-files/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.290812 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-reloader/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.301892 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-metrics/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.327197 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/controller/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.492648 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/kube-rbac-proxy/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.497437 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/frr-metrics/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.503138 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/kube-rbac-proxy-frr/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.710987 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/reloader/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.809375 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-xksm8_43f0dfbf-ad37-403d-968c-852dec2e09a0/frr-k8s-webhook-server/0.log" Nov 27 12:07:13 crc kubenswrapper[4807]: I1127 12:07:13.986130 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7dd7bd5d6c-lvbpn_fd79b824-f426-4793-b5e4-b351642047f5/manager/0.log" Nov 27 12:07:14 crc kubenswrapper[4807]: I1127 12:07:14.158288 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-64bbfd4bf8-wlg8q_dfbac2a4-3c47-4ac8-8643-c322886121d4/webhook-server/0.log" Nov 27 12:07:14 crc kubenswrapper[4807]: I1127 12:07:14.276725 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-chfn8_6bdffc3f-68a4-4eb0-a4a7-725db327ea08/kube-rbac-proxy/0.log" Nov 27 12:07:14 crc kubenswrapper[4807]: I1127 12:07:14.781575 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-chfn8_6bdffc3f-68a4-4eb0-a4a7-725db327ea08/speaker/0.log" Nov 27 12:07:14 crc kubenswrapper[4807]: I1127 12:07:14.791697 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/frr/0.log" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.837234 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w8gg6"] Nov 27 12:07:24 crc kubenswrapper[4807]: E1127 12:07:24.838362 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2396e1-0743-49ed-a665-1aef889ee983" containerName="extract-content" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.838380 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2396e1-0743-49ed-a665-1aef889ee983" containerName="extract-content" Nov 27 12:07:24 crc kubenswrapper[4807]: E1127 12:07:24.838404 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2396e1-0743-49ed-a665-1aef889ee983" containerName="registry-server" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.838411 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2396e1-0743-49ed-a665-1aef889ee983" containerName="registry-server" Nov 27 12:07:24 
crc kubenswrapper[4807]: E1127 12:07:24.838431 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2396e1-0743-49ed-a665-1aef889ee983" containerName="extract-utilities" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.838438 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2396e1-0743-49ed-a665-1aef889ee983" containerName="extract-utilities" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.838675 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2396e1-0743-49ed-a665-1aef889ee983" containerName="registry-server" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.840481 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.847908 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w8gg6"] Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.849155 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-utilities\") pod \"community-operators-w8gg6\" (UID: \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\") " pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.849287 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t268b\" (UniqueName: \"kubernetes.io/projected/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-kube-api-access-t268b\") pod \"community-operators-w8gg6\" (UID: \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\") " pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.849352 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-catalog-content\") pod \"community-operators-w8gg6\" (UID: \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\") " pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.951257 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-catalog-content\") pod \"community-operators-w8gg6\" (UID: \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\") " pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.951403 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-utilities\") pod \"community-operators-w8gg6\" (UID: \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\") " pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.951461 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t268b\" (UniqueName: \"kubernetes.io/projected/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-kube-api-access-t268b\") pod \"community-operators-w8gg6\" (UID: \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\") " pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.951739 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-catalog-content\") pod \"community-operators-w8gg6\" (UID: \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\") " pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.951898 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-utilities\") pod \"community-operators-w8gg6\" (UID: \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\") " pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:24 crc kubenswrapper[4807]: I1127 12:07:24.980763 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t268b\" (UniqueName: \"kubernetes.io/projected/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-kube-api-access-t268b\") pod \"community-operators-w8gg6\" (UID: \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\") " pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:25 crc kubenswrapper[4807]: I1127 12:07:25.214650 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:25 crc kubenswrapper[4807]: I1127 12:07:25.771657 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w8gg6"] Nov 27 12:07:25 crc kubenswrapper[4807]: I1127 12:07:25.998312 4807 generic.go:334] "Generic (PLEG): container finished" podID="4ac3c18c-22f9-42d8-bb00-6a5e148316ba" containerID="b7ae0c1cab9b8a0ba3a2e8289fe4027a4a1817c51ee6c4edc739ee21a1519ef5" exitCode=0 Nov 27 12:07:25 crc kubenswrapper[4807]: I1127 12:07:25.998392 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8gg6" event={"ID":"4ac3c18c-22f9-42d8-bb00-6a5e148316ba","Type":"ContainerDied","Data":"b7ae0c1cab9b8a0ba3a2e8289fe4027a4a1817c51ee6c4edc739ee21a1519ef5"} Nov 27 12:07:25 crc kubenswrapper[4807]: I1127 12:07:25.998642 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8gg6" event={"ID":"4ac3c18c-22f9-42d8-bb00-6a5e148316ba","Type":"ContainerStarted","Data":"50520c3f974069e7781cd95e6d977a076389f0c2c223a548a7c48da0bae79ebe"} Nov 27 12:07:26 crc kubenswrapper[4807]: I1127 12:07:26.677084 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/util/0.log" Nov 27 12:07:26 crc kubenswrapper[4807]: I1127 12:07:26.881206 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/util/0.log" Nov 27 12:07:26 crc kubenswrapper[4807]: I1127 12:07:26.888933 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/pull/0.log" Nov 27 12:07:26 crc kubenswrapper[4807]: I1127 12:07:26.935833 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/pull/0.log" Nov 27 12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.058105 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/pull/0.log" Nov 27 12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.062279 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/extract/0.log" Nov 27 12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.065624 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/util/0.log" Nov 27 12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.231379 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/util/0.log" Nov 27 
12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.379238 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/pull/0.log" Nov 27 12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.409707 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/util/0.log" Nov 27 12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.443756 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/pull/0.log" Nov 27 12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.621113 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/util/0.log" Nov 27 12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.650905 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/pull/0.log" Nov 27 12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.664569 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/extract/0.log" Nov 27 12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.814535 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/extract-utilities/0.log" Nov 27 12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.971039 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/extract-content/0.log" Nov 27 12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.974642 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/extract-utilities/0.log" Nov 27 12:07:27 crc kubenswrapper[4807]: I1127 12:07:27.982102 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/extract-content/0.log" Nov 27 12:07:28 crc kubenswrapper[4807]: I1127 12:07:28.015551 4807 generic.go:334] "Generic (PLEG): container finished" podID="4ac3c18c-22f9-42d8-bb00-6a5e148316ba" containerID="a725c05e74d18639089ded6d05f60ff9a0c24d7bf85d1089493435382b10e1dd" exitCode=0 Nov 27 12:07:28 crc kubenswrapper[4807]: I1127 12:07:28.015591 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8gg6" event={"ID":"4ac3c18c-22f9-42d8-bb00-6a5e148316ba","Type":"ContainerDied","Data":"a725c05e74d18639089ded6d05f60ff9a0c24d7bf85d1089493435382b10e1dd"} Nov 27 12:07:28 crc kubenswrapper[4807]: I1127 12:07:28.150459 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/extract-utilities/0.log" Nov 27 12:07:28 crc kubenswrapper[4807]: I1127 12:07:28.187792 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/extract-content/0.log" Nov 27 12:07:28 crc kubenswrapper[4807]: I1127 12:07:28.369717 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-w8gg6_4ac3c18c-22f9-42d8-bb00-6a5e148316ba/extract-utilities/0.log" Nov 27 12:07:28 crc kubenswrapper[4807]: I1127 12:07:28.572195 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/registry-server/0.log" Nov 27 12:07:28 crc kubenswrapper[4807]: I1127 12:07:28.647712 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-w8gg6_4ac3c18c-22f9-42d8-bb00-6a5e148316ba/extract-utilities/0.log" Nov 27 12:07:28 crc kubenswrapper[4807]: I1127 12:07:28.651131 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-w8gg6_4ac3c18c-22f9-42d8-bb00-6a5e148316ba/extract-content/0.log" Nov 27 12:07:28 crc kubenswrapper[4807]: I1127 12:07:28.655851 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-w8gg6_4ac3c18c-22f9-42d8-bb00-6a5e148316ba/extract-content/0.log" Nov 27 12:07:28 crc kubenswrapper[4807]: I1127 12:07:28.802990 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-w8gg6_4ac3c18c-22f9-42d8-bb00-6a5e148316ba/extract-utilities/0.log" Nov 27 12:07:28 crc kubenswrapper[4807]: I1127 12:07:28.827239 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-w8gg6_4ac3c18c-22f9-42d8-bb00-6a5e148316ba/extract-content/0.log" Nov 27 12:07:28 crc kubenswrapper[4807]: I1127 12:07:28.987781 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/extract-utilities/0.log" Nov 27 12:07:29 crc kubenswrapper[4807]: I1127 12:07:29.025330 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8gg6" event={"ID":"4ac3c18c-22f9-42d8-bb00-6a5e148316ba","Type":"ContainerStarted","Data":"b1747f884168b0b80002eaa60093386fe1144f579ddf63911f99593473e61033"} Nov 27 12:07:29 crc kubenswrapper[4807]: I1127 12:07:29.049460 4807 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/community-operators-w8gg6" podStartSLOduration=2.473674019 podStartE2EDuration="5.049443503s" podCreationTimestamp="2025-11-27 12:07:24 +0000 UTC" firstStartedPulling="2025-11-27 12:07:26.001356 +0000 UTC m=+3487.100854198" lastFinishedPulling="2025-11-27 12:07:28.577125484 +0000 UTC m=+3489.676623682" observedRunningTime="2025-11-27 12:07:29.041464621 +0000 UTC m=+3490.140962829" watchObservedRunningTime="2025-11-27 12:07:29.049443503 +0000 UTC m=+3490.148941701" Nov 27 12:07:29 crc kubenswrapper[4807]: I1127 12:07:29.207139 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/extract-content/0.log" Nov 27 12:07:29 crc kubenswrapper[4807]: I1127 12:07:29.255808 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/extract-content/0.log" Nov 27 12:07:29 crc kubenswrapper[4807]: I1127 12:07:29.278945 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/extract-utilities/0.log" Nov 27 12:07:29 crc kubenswrapper[4807]: I1127 12:07:29.507897 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/extract-utilities/0.log" Nov 27 12:07:29 crc kubenswrapper[4807]: I1127 12:07:29.570956 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/extract-content/0.log" Nov 27 12:07:29 crc kubenswrapper[4807]: I1127 12:07:29.802275 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q6pz5_c5121ac2-4e63-4d46-b899-89bbfbb19550/marketplace-operator/2.log" Nov 27 12:07:29 crc kubenswrapper[4807]: I1127 12:07:29.850854 
4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q6pz5_c5121ac2-4e63-4d46-b899-89bbfbb19550/marketplace-operator/3.log" Nov 27 12:07:29 crc kubenswrapper[4807]: I1127 12:07:29.875483 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/registry-server/0.log" Nov 27 12:07:30 crc kubenswrapper[4807]: I1127 12:07:30.046932 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/extract-utilities/0.log" Nov 27 12:07:30 crc kubenswrapper[4807]: I1127 12:07:30.233371 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/extract-content/0.log" Nov 27 12:07:30 crc kubenswrapper[4807]: I1127 12:07:30.233659 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/extract-utilities/0.log" Nov 27 12:07:30 crc kubenswrapper[4807]: I1127 12:07:30.254870 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/extract-content/0.log" Nov 27 12:07:30 crc kubenswrapper[4807]: I1127 12:07:30.460014 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/extract-content/0.log" Nov 27 12:07:30 crc kubenswrapper[4807]: I1127 12:07:30.492191 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/extract-utilities/0.log" Nov 27 12:07:30 crc kubenswrapper[4807]: I1127 12:07:30.514333 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/registry-server/0.log" Nov 27 12:07:30 crc kubenswrapper[4807]: I1127 12:07:30.673597 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/extract-utilities/0.log" Nov 27 12:07:30 crc kubenswrapper[4807]: I1127 12:07:30.870815 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/extract-content/0.log" Nov 27 12:07:30 crc kubenswrapper[4807]: I1127 12:07:30.879115 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/extract-utilities/0.log" Nov 27 12:07:30 crc kubenswrapper[4807]: I1127 12:07:30.881796 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/extract-content/0.log" Nov 27 12:07:31 crc kubenswrapper[4807]: I1127 12:07:31.070452 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/extract-content/0.log" Nov 27 12:07:31 crc kubenswrapper[4807]: I1127 12:07:31.074382 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/extract-utilities/0.log" Nov 27 12:07:31 crc kubenswrapper[4807]: I1127 12:07:31.921284 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/registry-server/0.log" Nov 27 12:07:35 crc kubenswrapper[4807]: I1127 12:07:35.215769 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:35 crc kubenswrapper[4807]: I1127 
12:07:35.216539 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:35 crc kubenswrapper[4807]: I1127 12:07:35.285808 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:36 crc kubenswrapper[4807]: I1127 12:07:36.134115 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:36 crc kubenswrapper[4807]: I1127 12:07:36.184322 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w8gg6"] Nov 27 12:07:38 crc kubenswrapper[4807]: I1127 12:07:38.110723 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w8gg6" podUID="4ac3c18c-22f9-42d8-bb00-6a5e148316ba" containerName="registry-server" containerID="cri-o://b1747f884168b0b80002eaa60093386fe1144f579ddf63911f99593473e61033" gracePeriod=2 Nov 27 12:07:38 crc kubenswrapper[4807]: I1127 12:07:38.595572 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:38 crc kubenswrapper[4807]: I1127 12:07:38.704962 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-utilities\") pod \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\" (UID: \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\") " Nov 27 12:07:38 crc kubenswrapper[4807]: I1127 12:07:38.705155 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t268b\" (UniqueName: \"kubernetes.io/projected/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-kube-api-access-t268b\") pod \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\" (UID: \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\") " Nov 27 12:07:38 crc kubenswrapper[4807]: I1127 12:07:38.705370 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-catalog-content\") pod \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\" (UID: \"4ac3c18c-22f9-42d8-bb00-6a5e148316ba\") " Nov 27 12:07:38 crc kubenswrapper[4807]: I1127 12:07:38.708672 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-utilities" (OuterVolumeSpecName: "utilities") pod "4ac3c18c-22f9-42d8-bb00-6a5e148316ba" (UID: "4ac3c18c-22f9-42d8-bb00-6a5e148316ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:07:38 crc kubenswrapper[4807]: I1127 12:07:38.725351 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-kube-api-access-t268b" (OuterVolumeSpecName: "kube-api-access-t268b") pod "4ac3c18c-22f9-42d8-bb00-6a5e148316ba" (UID: "4ac3c18c-22f9-42d8-bb00-6a5e148316ba"). InnerVolumeSpecName "kube-api-access-t268b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:07:38 crc kubenswrapper[4807]: I1127 12:07:38.778860 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ac3c18c-22f9-42d8-bb00-6a5e148316ba" (UID: "4ac3c18c-22f9-42d8-bb00-6a5e148316ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:07:38 crc kubenswrapper[4807]: I1127 12:07:38.807598 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 12:07:38 crc kubenswrapper[4807]: I1127 12:07:38.807634 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 12:07:38 crc kubenswrapper[4807]: I1127 12:07:38.807646 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t268b\" (UniqueName: \"kubernetes.io/projected/4ac3c18c-22f9-42d8-bb00-6a5e148316ba-kube-api-access-t268b\") on node \"crc\" DevicePath \"\"" Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.122602 4807 generic.go:334] "Generic (PLEG): container finished" podID="4ac3c18c-22f9-42d8-bb00-6a5e148316ba" containerID="b1747f884168b0b80002eaa60093386fe1144f579ddf63911f99593473e61033" exitCode=0 Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.122658 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w8gg6" event={"ID":"4ac3c18c-22f9-42d8-bb00-6a5e148316ba","Type":"ContainerDied","Data":"b1747f884168b0b80002eaa60093386fe1144f579ddf63911f99593473e61033"} Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.122693 4807 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-w8gg6" event={"ID":"4ac3c18c-22f9-42d8-bb00-6a5e148316ba","Type":"ContainerDied","Data":"50520c3f974069e7781cd95e6d977a076389f0c2c223a548a7c48da0bae79ebe"} Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.122717 4807 scope.go:117] "RemoveContainer" containerID="b1747f884168b0b80002eaa60093386fe1144f579ddf63911f99593473e61033" Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.122716 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w8gg6" Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.146728 4807 scope.go:117] "RemoveContainer" containerID="a725c05e74d18639089ded6d05f60ff9a0c24d7bf85d1089493435382b10e1dd" Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.174167 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w8gg6"] Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.177386 4807 scope.go:117] "RemoveContainer" containerID="b7ae0c1cab9b8a0ba3a2e8289fe4027a4a1817c51ee6c4edc739ee21a1519ef5" Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.184417 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w8gg6"] Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.230679 4807 scope.go:117] "RemoveContainer" containerID="b1747f884168b0b80002eaa60093386fe1144f579ddf63911f99593473e61033" Nov 27 12:07:39 crc kubenswrapper[4807]: E1127 12:07:39.231229 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1747f884168b0b80002eaa60093386fe1144f579ddf63911f99593473e61033\": container with ID starting with b1747f884168b0b80002eaa60093386fe1144f579ddf63911f99593473e61033 not found: ID does not exist" containerID="b1747f884168b0b80002eaa60093386fe1144f579ddf63911f99593473e61033" Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 
12:07:39.231335 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1747f884168b0b80002eaa60093386fe1144f579ddf63911f99593473e61033"} err="failed to get container status \"b1747f884168b0b80002eaa60093386fe1144f579ddf63911f99593473e61033\": rpc error: code = NotFound desc = could not find container \"b1747f884168b0b80002eaa60093386fe1144f579ddf63911f99593473e61033\": container with ID starting with b1747f884168b0b80002eaa60093386fe1144f579ddf63911f99593473e61033 not found: ID does not exist" Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.231365 4807 scope.go:117] "RemoveContainer" containerID="a725c05e74d18639089ded6d05f60ff9a0c24d7bf85d1089493435382b10e1dd" Nov 27 12:07:39 crc kubenswrapper[4807]: E1127 12:07:39.231815 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a725c05e74d18639089ded6d05f60ff9a0c24d7bf85d1089493435382b10e1dd\": container with ID starting with a725c05e74d18639089ded6d05f60ff9a0c24d7bf85d1089493435382b10e1dd not found: ID does not exist" containerID="a725c05e74d18639089ded6d05f60ff9a0c24d7bf85d1089493435382b10e1dd" Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.231879 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a725c05e74d18639089ded6d05f60ff9a0c24d7bf85d1089493435382b10e1dd"} err="failed to get container status \"a725c05e74d18639089ded6d05f60ff9a0c24d7bf85d1089493435382b10e1dd\": rpc error: code = NotFound desc = could not find container \"a725c05e74d18639089ded6d05f60ff9a0c24d7bf85d1089493435382b10e1dd\": container with ID starting with a725c05e74d18639089ded6d05f60ff9a0c24d7bf85d1089493435382b10e1dd not found: ID does not exist" Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.231922 4807 scope.go:117] "RemoveContainer" containerID="b7ae0c1cab9b8a0ba3a2e8289fe4027a4a1817c51ee6c4edc739ee21a1519ef5" Nov 27 12:07:39 crc 
kubenswrapper[4807]: E1127 12:07:39.232560 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7ae0c1cab9b8a0ba3a2e8289fe4027a4a1817c51ee6c4edc739ee21a1519ef5\": container with ID starting with b7ae0c1cab9b8a0ba3a2e8289fe4027a4a1817c51ee6c4edc739ee21a1519ef5 not found: ID does not exist" containerID="b7ae0c1cab9b8a0ba3a2e8289fe4027a4a1817c51ee6c4edc739ee21a1519ef5" Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.232608 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7ae0c1cab9b8a0ba3a2e8289fe4027a4a1817c51ee6c4edc739ee21a1519ef5"} err="failed to get container status \"b7ae0c1cab9b8a0ba3a2e8289fe4027a4a1817c51ee6c4edc739ee21a1519ef5\": rpc error: code = NotFound desc = could not find container \"b7ae0c1cab9b8a0ba3a2e8289fe4027a4a1817c51ee6c4edc739ee21a1519ef5\": container with ID starting with b7ae0c1cab9b8a0ba3a2e8289fe4027a4a1817c51ee6c4edc739ee21a1519ef5 not found: ID does not exist" Nov 27 12:07:39 crc kubenswrapper[4807]: I1127 12:07:39.556953 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ac3c18c-22f9-42d8-bb00-6a5e148316ba" path="/var/lib/kubelet/pods/4ac3c18c-22f9-42d8-bb00-6a5e148316ba/volumes" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.331509 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nh7rm"] Nov 27 12:08:25 crc kubenswrapper[4807]: E1127 12:08:25.332493 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac3c18c-22f9-42d8-bb00-6a5e148316ba" containerName="registry-server" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.332508 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac3c18c-22f9-42d8-bb00-6a5e148316ba" containerName="registry-server" Nov 27 12:08:25 crc kubenswrapper[4807]: E1127 12:08:25.332545 4807 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4ac3c18c-22f9-42d8-bb00-6a5e148316ba" containerName="extract-utilities" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.332552 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac3c18c-22f9-42d8-bb00-6a5e148316ba" containerName="extract-utilities" Nov 27 12:08:25 crc kubenswrapper[4807]: E1127 12:08:25.332563 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac3c18c-22f9-42d8-bb00-6a5e148316ba" containerName="extract-content" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.332569 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac3c18c-22f9-42d8-bb00-6a5e148316ba" containerName="extract-content" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.332744 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ac3c18c-22f9-42d8-bb00-6a5e148316ba" containerName="registry-server" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.335671 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.347598 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nh7rm"] Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.481325 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d4gt\" (UniqueName: \"kubernetes.io/projected/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-kube-api-access-4d4gt\") pod \"redhat-marketplace-nh7rm\" (UID: \"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\") " pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.481703 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-utilities\") pod \"redhat-marketplace-nh7rm\" (UID: 
\"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\") " pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.481786 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-catalog-content\") pod \"redhat-marketplace-nh7rm\" (UID: \"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\") " pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.583156 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-utilities\") pod \"redhat-marketplace-nh7rm\" (UID: \"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\") " pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.583294 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-catalog-content\") pod \"redhat-marketplace-nh7rm\" (UID: \"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\") " pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.583419 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d4gt\" (UniqueName: \"kubernetes.io/projected/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-kube-api-access-4d4gt\") pod \"redhat-marketplace-nh7rm\" (UID: \"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\") " pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.583609 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-utilities\") pod \"redhat-marketplace-nh7rm\" (UID: 
\"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\") " pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.583900 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-catalog-content\") pod \"redhat-marketplace-nh7rm\" (UID: \"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\") " pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.603389 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d4gt\" (UniqueName: \"kubernetes.io/projected/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-kube-api-access-4d4gt\") pod \"redhat-marketplace-nh7rm\" (UID: \"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\") " pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:25 crc kubenswrapper[4807]: I1127 12:08:25.691426 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:26 crc kubenswrapper[4807]: I1127 12:08:26.138048 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nh7rm"] Nov 27 12:08:26 crc kubenswrapper[4807]: W1127 12:08:26.144155 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dd90c55_f5dc_4666_b6e4_f5f081ea192e.slice/crio-ad97df37eae9c45a26de0b360ff93d141735f93d33a7be5955d8154d2c5c561f WatchSource:0}: Error finding container ad97df37eae9c45a26de0b360ff93d141735f93d33a7be5955d8154d2c5c561f: Status 404 returned error can't find the container with id ad97df37eae9c45a26de0b360ff93d141735f93d33a7be5955d8154d2c5c561f Nov 27 12:08:26 crc kubenswrapper[4807]: I1127 12:08:26.575272 4807 generic.go:334] "Generic (PLEG): container finished" podID="9dd90c55-f5dc-4666-b6e4-f5f081ea192e" 
containerID="63d83432886f0898e13980d9b6108d659417ab87be25173052a26ba8e5d01682" exitCode=0 Nov 27 12:08:26 crc kubenswrapper[4807]: I1127 12:08:26.575325 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nh7rm" event={"ID":"9dd90c55-f5dc-4666-b6e4-f5f081ea192e","Type":"ContainerDied","Data":"63d83432886f0898e13980d9b6108d659417ab87be25173052a26ba8e5d01682"} Nov 27 12:08:26 crc kubenswrapper[4807]: I1127 12:08:26.575372 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nh7rm" event={"ID":"9dd90c55-f5dc-4666-b6e4-f5f081ea192e","Type":"ContainerStarted","Data":"ad97df37eae9c45a26de0b360ff93d141735f93d33a7be5955d8154d2c5c561f"} Nov 27 12:08:27 crc kubenswrapper[4807]: I1127 12:08:27.586493 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nh7rm" event={"ID":"9dd90c55-f5dc-4666-b6e4-f5f081ea192e","Type":"ContainerStarted","Data":"657c87e5cde5a475a1dd81b34384975319fd847227f3c905dd71b2921a286e90"} Nov 27 12:08:28 crc kubenswrapper[4807]: I1127 12:08:28.600069 4807 generic.go:334] "Generic (PLEG): container finished" podID="9dd90c55-f5dc-4666-b6e4-f5f081ea192e" containerID="657c87e5cde5a475a1dd81b34384975319fd847227f3c905dd71b2921a286e90" exitCode=0 Nov 27 12:08:28 crc kubenswrapper[4807]: I1127 12:08:28.600425 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nh7rm" event={"ID":"9dd90c55-f5dc-4666-b6e4-f5f081ea192e","Type":"ContainerDied","Data":"657c87e5cde5a475a1dd81b34384975319fd847227f3c905dd71b2921a286e90"} Nov 27 12:08:30 crc kubenswrapper[4807]: I1127 12:08:30.622410 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nh7rm" event={"ID":"9dd90c55-f5dc-4666-b6e4-f5f081ea192e","Type":"ContainerStarted","Data":"001afbc84e79d075f6fe23872d983045d05d2bf033b601b805fb863a91a1e3b0"} Nov 27 12:08:30 crc kubenswrapper[4807]: 
I1127 12:08:30.639841 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nh7rm" podStartSLOduration=2.8130898330000003 podStartE2EDuration="5.639808446s" podCreationTimestamp="2025-11-27 12:08:25 +0000 UTC" firstStartedPulling="2025-11-27 12:08:26.593362644 +0000 UTC m=+3547.692860842" lastFinishedPulling="2025-11-27 12:08:29.420081247 +0000 UTC m=+3550.519579455" observedRunningTime="2025-11-27 12:08:30.638295036 +0000 UTC m=+3551.737793244" watchObservedRunningTime="2025-11-27 12:08:30.639808446 +0000 UTC m=+3551.739306644" Nov 27 12:08:35 crc kubenswrapper[4807]: I1127 12:08:35.691547 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:35 crc kubenswrapper[4807]: I1127 12:08:35.692166 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:35 crc kubenswrapper[4807]: I1127 12:08:35.765689 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:36 crc kubenswrapper[4807]: I1127 12:08:36.714428 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:36 crc kubenswrapper[4807]: I1127 12:08:36.776457 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nh7rm"] Nov 27 12:08:38 crc kubenswrapper[4807]: I1127 12:08:38.698387 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nh7rm" podUID="9dd90c55-f5dc-4666-b6e4-f5f081ea192e" containerName="registry-server" containerID="cri-o://001afbc84e79d075f6fe23872d983045d05d2bf033b601b805fb863a91a1e3b0" gracePeriod=2 Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.234032 4807 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.354088 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4gt\" (UniqueName: \"kubernetes.io/projected/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-kube-api-access-4d4gt\") pod \"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\" (UID: \"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\") " Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.354455 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-catalog-content\") pod \"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\" (UID: \"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\") " Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.354514 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-utilities\") pod \"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\" (UID: \"9dd90c55-f5dc-4666-b6e4-f5f081ea192e\") " Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.358082 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-utilities" (OuterVolumeSpecName: "utilities") pod "9dd90c55-f5dc-4666-b6e4-f5f081ea192e" (UID: "9dd90c55-f5dc-4666-b6e4-f5f081ea192e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.359806 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-kube-api-access-4d4gt" (OuterVolumeSpecName: "kube-api-access-4d4gt") pod "9dd90c55-f5dc-4666-b6e4-f5f081ea192e" (UID: "9dd90c55-f5dc-4666-b6e4-f5f081ea192e"). 
InnerVolumeSpecName "kube-api-access-4d4gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.382071 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9dd90c55-f5dc-4666-b6e4-f5f081ea192e" (UID: "9dd90c55-f5dc-4666-b6e4-f5f081ea192e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.461075 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.461109 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4gt\" (UniqueName: \"kubernetes.io/projected/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-kube-api-access-4d4gt\") on node \"crc\" DevicePath \"\"" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.461120 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dd90c55-f5dc-4666-b6e4-f5f081ea192e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.712285 4807 generic.go:334] "Generic (PLEG): container finished" podID="9dd90c55-f5dc-4666-b6e4-f5f081ea192e" containerID="001afbc84e79d075f6fe23872d983045d05d2bf033b601b805fb863a91a1e3b0" exitCode=0 Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.712330 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nh7rm" event={"ID":"9dd90c55-f5dc-4666-b6e4-f5f081ea192e","Type":"ContainerDied","Data":"001afbc84e79d075f6fe23872d983045d05d2bf033b601b805fb863a91a1e3b0"} Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.712369 4807 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nh7rm" event={"ID":"9dd90c55-f5dc-4666-b6e4-f5f081ea192e","Type":"ContainerDied","Data":"ad97df37eae9c45a26de0b360ff93d141735f93d33a7be5955d8154d2c5c561f"} Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.712389 4807 scope.go:117] "RemoveContainer" containerID="001afbc84e79d075f6fe23872d983045d05d2bf033b601b805fb863a91a1e3b0" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.712401 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nh7rm" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.736957 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nh7rm"] Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.738948 4807 scope.go:117] "RemoveContainer" containerID="657c87e5cde5a475a1dd81b34384975319fd847227f3c905dd71b2921a286e90" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.748444 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nh7rm"] Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.759736 4807 scope.go:117] "RemoveContainer" containerID="63d83432886f0898e13980d9b6108d659417ab87be25173052a26ba8e5d01682" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.803059 4807 scope.go:117] "RemoveContainer" containerID="001afbc84e79d075f6fe23872d983045d05d2bf033b601b805fb863a91a1e3b0" Nov 27 12:08:39 crc kubenswrapper[4807]: E1127 12:08:39.803580 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"001afbc84e79d075f6fe23872d983045d05d2bf033b601b805fb863a91a1e3b0\": container with ID starting with 001afbc84e79d075f6fe23872d983045d05d2bf033b601b805fb863a91a1e3b0 not found: ID does not exist" containerID="001afbc84e79d075f6fe23872d983045d05d2bf033b601b805fb863a91a1e3b0" Nov 27 12:08:39 crc 
kubenswrapper[4807]: I1127 12:08:39.803620 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"001afbc84e79d075f6fe23872d983045d05d2bf033b601b805fb863a91a1e3b0"} err="failed to get container status \"001afbc84e79d075f6fe23872d983045d05d2bf033b601b805fb863a91a1e3b0\": rpc error: code = NotFound desc = could not find container \"001afbc84e79d075f6fe23872d983045d05d2bf033b601b805fb863a91a1e3b0\": container with ID starting with 001afbc84e79d075f6fe23872d983045d05d2bf033b601b805fb863a91a1e3b0 not found: ID does not exist" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.803645 4807 scope.go:117] "RemoveContainer" containerID="657c87e5cde5a475a1dd81b34384975319fd847227f3c905dd71b2921a286e90" Nov 27 12:08:39 crc kubenswrapper[4807]: E1127 12:08:39.804003 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"657c87e5cde5a475a1dd81b34384975319fd847227f3c905dd71b2921a286e90\": container with ID starting with 657c87e5cde5a475a1dd81b34384975319fd847227f3c905dd71b2921a286e90 not found: ID does not exist" containerID="657c87e5cde5a475a1dd81b34384975319fd847227f3c905dd71b2921a286e90" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.804025 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"657c87e5cde5a475a1dd81b34384975319fd847227f3c905dd71b2921a286e90"} err="failed to get container status \"657c87e5cde5a475a1dd81b34384975319fd847227f3c905dd71b2921a286e90\": rpc error: code = NotFound desc = could not find container \"657c87e5cde5a475a1dd81b34384975319fd847227f3c905dd71b2921a286e90\": container with ID starting with 657c87e5cde5a475a1dd81b34384975319fd847227f3c905dd71b2921a286e90 not found: ID does not exist" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.804036 4807 scope.go:117] "RemoveContainer" containerID="63d83432886f0898e13980d9b6108d659417ab87be25173052a26ba8e5d01682" Nov 27 
12:08:39 crc kubenswrapper[4807]: E1127 12:08:39.804345 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63d83432886f0898e13980d9b6108d659417ab87be25173052a26ba8e5d01682\": container with ID starting with 63d83432886f0898e13980d9b6108d659417ab87be25173052a26ba8e5d01682 not found: ID does not exist" containerID="63d83432886f0898e13980d9b6108d659417ab87be25173052a26ba8e5d01682" Nov 27 12:08:39 crc kubenswrapper[4807]: I1127 12:08:39.804374 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63d83432886f0898e13980d9b6108d659417ab87be25173052a26ba8e5d01682"} err="failed to get container status \"63d83432886f0898e13980d9b6108d659417ab87be25173052a26ba8e5d01682\": rpc error: code = NotFound desc = could not find container \"63d83432886f0898e13980d9b6108d659417ab87be25173052a26ba8e5d01682\": container with ID starting with 63d83432886f0898e13980d9b6108d659417ab87be25173052a26ba8e5d01682 not found: ID does not exist" Nov 27 12:08:41 crc kubenswrapper[4807]: I1127 12:08:41.554792 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dd90c55-f5dc-4666-b6e4-f5f081ea192e" path="/var/lib/kubelet/pods/9dd90c55-f5dc-4666-b6e4-f5f081ea192e/volumes" Nov 27 12:08:50 crc kubenswrapper[4807]: I1127 12:08:50.921293 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 12:08:50 crc kubenswrapper[4807]: I1127 12:08:50.921932 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.347043 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-clnd9"] Nov 27 12:08:54 crc kubenswrapper[4807]: E1127 12:08:54.347842 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd90c55-f5dc-4666-b6e4-f5f081ea192e" containerName="registry-server" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.347854 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd90c55-f5dc-4666-b6e4-f5f081ea192e" containerName="registry-server" Nov 27 12:08:54 crc kubenswrapper[4807]: E1127 12:08:54.347885 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd90c55-f5dc-4666-b6e4-f5f081ea192e" containerName="extract-utilities" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.347891 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd90c55-f5dc-4666-b6e4-f5f081ea192e" containerName="extract-utilities" Nov 27 12:08:54 crc kubenswrapper[4807]: E1127 12:08:54.347901 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dd90c55-f5dc-4666-b6e4-f5f081ea192e" containerName="extract-content" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.347907 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dd90c55-f5dc-4666-b6e4-f5f081ea192e" containerName="extract-content" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.348086 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dd90c55-f5dc-4666-b6e4-f5f081ea192e" containerName="registry-server" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.349486 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.368090 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-clnd9"] Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.542971 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8dc9\" (UniqueName: \"kubernetes.io/projected/6791a75c-fb0a-468e-b80b-b339afb8c9ac-kube-api-access-w8dc9\") pod \"certified-operators-clnd9\" (UID: \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\") " pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.543031 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6791a75c-fb0a-468e-b80b-b339afb8c9ac-catalog-content\") pod \"certified-operators-clnd9\" (UID: \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\") " pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.543115 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6791a75c-fb0a-468e-b80b-b339afb8c9ac-utilities\") pod \"certified-operators-clnd9\" (UID: \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\") " pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.644407 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8dc9\" (UniqueName: \"kubernetes.io/projected/6791a75c-fb0a-468e-b80b-b339afb8c9ac-kube-api-access-w8dc9\") pod \"certified-operators-clnd9\" (UID: \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\") " pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.644522 4807 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6791a75c-fb0a-468e-b80b-b339afb8c9ac-catalog-content\") pod \"certified-operators-clnd9\" (UID: \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\") " pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.644778 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6791a75c-fb0a-468e-b80b-b339afb8c9ac-utilities\") pod \"certified-operators-clnd9\" (UID: \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\") " pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.647420 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6791a75c-fb0a-468e-b80b-b339afb8c9ac-utilities\") pod \"certified-operators-clnd9\" (UID: \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\") " pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.647445 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6791a75c-fb0a-468e-b80b-b339afb8c9ac-catalog-content\") pod \"certified-operators-clnd9\" (UID: \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\") " pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.677069 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8dc9\" (UniqueName: \"kubernetes.io/projected/6791a75c-fb0a-468e-b80b-b339afb8c9ac-kube-api-access-w8dc9\") pod \"certified-operators-clnd9\" (UID: \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\") " pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:08:54 crc kubenswrapper[4807]: I1127 12:08:54.968142 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:08:55 crc kubenswrapper[4807]: I1127 12:08:55.460086 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-clnd9"] Nov 27 12:08:55 crc kubenswrapper[4807]: W1127 12:08:55.480635 4807 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6791a75c_fb0a_468e_b80b_b339afb8c9ac.slice/crio-2c1474fa5fa0b82208d2288ccb559dfcc2b367c1aa67a7ff15e9fd50e52bddf9 WatchSource:0}: Error finding container 2c1474fa5fa0b82208d2288ccb559dfcc2b367c1aa67a7ff15e9fd50e52bddf9: Status 404 returned error can't find the container with id 2c1474fa5fa0b82208d2288ccb559dfcc2b367c1aa67a7ff15e9fd50e52bddf9 Nov 27 12:08:55 crc kubenswrapper[4807]: I1127 12:08:55.891201 4807 generic.go:334] "Generic (PLEG): container finished" podID="6791a75c-fb0a-468e-b80b-b339afb8c9ac" containerID="52a103ab329c62b6d5f758f22df457b81e292aaf47dfea6eaffcd711b4ae769a" exitCode=0 Nov 27 12:08:55 crc kubenswrapper[4807]: I1127 12:08:55.891300 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clnd9" event={"ID":"6791a75c-fb0a-468e-b80b-b339afb8c9ac","Type":"ContainerDied","Data":"52a103ab329c62b6d5f758f22df457b81e292aaf47dfea6eaffcd711b4ae769a"} Nov 27 12:08:55 crc kubenswrapper[4807]: I1127 12:08:55.891555 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clnd9" event={"ID":"6791a75c-fb0a-468e-b80b-b339afb8c9ac","Type":"ContainerStarted","Data":"2c1474fa5fa0b82208d2288ccb559dfcc2b367c1aa67a7ff15e9fd50e52bddf9"} Nov 27 12:08:55 crc kubenswrapper[4807]: I1127 12:08:55.893442 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 12:08:56 crc kubenswrapper[4807]: I1127 12:08:56.900646 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-clnd9" event={"ID":"6791a75c-fb0a-468e-b80b-b339afb8c9ac","Type":"ContainerStarted","Data":"2707d4181cf03162783c2493e0b3e2bf42b40702d3e8c75ec20caceb826dc562"} Nov 27 12:08:57 crc kubenswrapper[4807]: I1127 12:08:57.914425 4807 generic.go:334] "Generic (PLEG): container finished" podID="6791a75c-fb0a-468e-b80b-b339afb8c9ac" containerID="2707d4181cf03162783c2493e0b3e2bf42b40702d3e8c75ec20caceb826dc562" exitCode=0 Nov 27 12:08:57 crc kubenswrapper[4807]: I1127 12:08:57.914596 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clnd9" event={"ID":"6791a75c-fb0a-468e-b80b-b339afb8c9ac","Type":"ContainerDied","Data":"2707d4181cf03162783c2493e0b3e2bf42b40702d3e8c75ec20caceb826dc562"} Nov 27 12:08:58 crc kubenswrapper[4807]: I1127 12:08:58.924473 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clnd9" event={"ID":"6791a75c-fb0a-468e-b80b-b339afb8c9ac","Type":"ContainerStarted","Data":"a1042445db25660300edb87629682444089b3277d486e6fbc498dc8f8069170e"} Nov 27 12:09:04 crc kubenswrapper[4807]: I1127 12:09:04.968734 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:09:04 crc kubenswrapper[4807]: I1127 12:09:04.970511 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:09:05 crc kubenswrapper[4807]: I1127 12:09:05.029716 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:09:05 crc kubenswrapper[4807]: I1127 12:09:05.067010 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-clnd9" podStartSLOduration=8.592980872 podStartE2EDuration="11.066980488s" podCreationTimestamp="2025-11-27 12:08:54 +0000 UTC" 
firstStartedPulling="2025-11-27 12:08:55.892899394 +0000 UTC m=+3576.992397612" lastFinishedPulling="2025-11-27 12:08:58.36689904 +0000 UTC m=+3579.466397228" observedRunningTime="2025-11-27 12:08:58.94670792 +0000 UTC m=+3580.046206138" watchObservedRunningTime="2025-11-27 12:09:05.066980488 +0000 UTC m=+3586.166478716" Nov 27 12:09:06 crc kubenswrapper[4807]: I1127 12:09:06.049087 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:09:06 crc kubenswrapper[4807]: I1127 12:09:06.114422 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-clnd9"] Nov 27 12:09:07 crc kubenswrapper[4807]: I1127 12:09:07.013619 4807 generic.go:334] "Generic (PLEG): container finished" podID="ca489b67-7355-4cf9-a5f8-8fd359f37d63" containerID="b21e5d0fdb37f8175263d8b63106791709eaa244aa80616b13353856b2d346b8" exitCode=0 Nov 27 12:09:07 crc kubenswrapper[4807]: I1127 12:09:07.013698 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kg6jg/must-gather-k626h" event={"ID":"ca489b67-7355-4cf9-a5f8-8fd359f37d63","Type":"ContainerDied","Data":"b21e5d0fdb37f8175263d8b63106791709eaa244aa80616b13353856b2d346b8"} Nov 27 12:09:07 crc kubenswrapper[4807]: I1127 12:09:07.015045 4807 scope.go:117] "RemoveContainer" containerID="b21e5d0fdb37f8175263d8b63106791709eaa244aa80616b13353856b2d346b8" Nov 27 12:09:07 crc kubenswrapper[4807]: I1127 12:09:07.722710 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kg6jg_must-gather-k626h_ca489b67-7355-4cf9-a5f8-8fd359f37d63/gather/0.log" Nov 27 12:09:08 crc kubenswrapper[4807]: I1127 12:09:08.022910 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-clnd9" podUID="6791a75c-fb0a-468e-b80b-b339afb8c9ac" containerName="registry-server" 
containerID="cri-o://a1042445db25660300edb87629682444089b3277d486e6fbc498dc8f8069170e" gracePeriod=2 Nov 27 12:09:08 crc kubenswrapper[4807]: I1127 12:09:08.490300 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:09:08 crc kubenswrapper[4807]: I1127 12:09:08.641093 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6791a75c-fb0a-468e-b80b-b339afb8c9ac-utilities\") pod \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\" (UID: \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\") " Nov 27 12:09:08 crc kubenswrapper[4807]: I1127 12:09:08.641174 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8dc9\" (UniqueName: \"kubernetes.io/projected/6791a75c-fb0a-468e-b80b-b339afb8c9ac-kube-api-access-w8dc9\") pod \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\" (UID: \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\") " Nov 27 12:09:08 crc kubenswrapper[4807]: I1127 12:09:08.641315 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6791a75c-fb0a-468e-b80b-b339afb8c9ac-catalog-content\") pod \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\" (UID: \"6791a75c-fb0a-468e-b80b-b339afb8c9ac\") " Nov 27 12:09:08 crc kubenswrapper[4807]: I1127 12:09:08.641981 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6791a75c-fb0a-468e-b80b-b339afb8c9ac-utilities" (OuterVolumeSpecName: "utilities") pod "6791a75c-fb0a-468e-b80b-b339afb8c9ac" (UID: "6791a75c-fb0a-468e-b80b-b339afb8c9ac"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:09:08 crc kubenswrapper[4807]: I1127 12:09:08.643380 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6791a75c-fb0a-468e-b80b-b339afb8c9ac-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 12:09:08 crc kubenswrapper[4807]: I1127 12:09:08.647086 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6791a75c-fb0a-468e-b80b-b339afb8c9ac-kube-api-access-w8dc9" (OuterVolumeSpecName: "kube-api-access-w8dc9") pod "6791a75c-fb0a-468e-b80b-b339afb8c9ac" (UID: "6791a75c-fb0a-468e-b80b-b339afb8c9ac"). InnerVolumeSpecName "kube-api-access-w8dc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:09:08 crc kubenswrapper[4807]: I1127 12:09:08.697114 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6791a75c-fb0a-468e-b80b-b339afb8c9ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6791a75c-fb0a-468e-b80b-b339afb8c9ac" (UID: "6791a75c-fb0a-468e-b80b-b339afb8c9ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:09:08 crc kubenswrapper[4807]: I1127 12:09:08.745480 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8dc9\" (UniqueName: \"kubernetes.io/projected/6791a75c-fb0a-468e-b80b-b339afb8c9ac-kube-api-access-w8dc9\") on node \"crc\" DevicePath \"\"" Nov 27 12:09:08 crc kubenswrapper[4807]: I1127 12:09:08.746419 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6791a75c-fb0a-468e-b80b-b339afb8c9ac-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.032869 4807 generic.go:334] "Generic (PLEG): container finished" podID="6791a75c-fb0a-468e-b80b-b339afb8c9ac" containerID="a1042445db25660300edb87629682444089b3277d486e6fbc498dc8f8069170e" exitCode=0 Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.032919 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clnd9" event={"ID":"6791a75c-fb0a-468e-b80b-b339afb8c9ac","Type":"ContainerDied","Data":"a1042445db25660300edb87629682444089b3277d486e6fbc498dc8f8069170e"} Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.032950 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-clnd9" event={"ID":"6791a75c-fb0a-468e-b80b-b339afb8c9ac","Type":"ContainerDied","Data":"2c1474fa5fa0b82208d2288ccb559dfcc2b367c1aa67a7ff15e9fd50e52bddf9"} Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.032969 4807 scope.go:117] "RemoveContainer" containerID="a1042445db25660300edb87629682444089b3277d486e6fbc498dc8f8069170e" Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.033186 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-clnd9" Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.058465 4807 scope.go:117] "RemoveContainer" containerID="2707d4181cf03162783c2493e0b3e2bf42b40702d3e8c75ec20caceb826dc562" Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.083984 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-clnd9"] Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.090984 4807 scope.go:117] "RemoveContainer" containerID="52a103ab329c62b6d5f758f22df457b81e292aaf47dfea6eaffcd711b4ae769a" Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.093905 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-clnd9"] Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.155781 4807 scope.go:117] "RemoveContainer" containerID="a1042445db25660300edb87629682444089b3277d486e6fbc498dc8f8069170e" Nov 27 12:09:09 crc kubenswrapper[4807]: E1127 12:09:09.156146 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1042445db25660300edb87629682444089b3277d486e6fbc498dc8f8069170e\": container with ID starting with a1042445db25660300edb87629682444089b3277d486e6fbc498dc8f8069170e not found: ID does not exist" containerID="a1042445db25660300edb87629682444089b3277d486e6fbc498dc8f8069170e" Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.156176 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1042445db25660300edb87629682444089b3277d486e6fbc498dc8f8069170e"} err="failed to get container status \"a1042445db25660300edb87629682444089b3277d486e6fbc498dc8f8069170e\": rpc error: code = NotFound desc = could not find container \"a1042445db25660300edb87629682444089b3277d486e6fbc498dc8f8069170e\": container with ID starting with a1042445db25660300edb87629682444089b3277d486e6fbc498dc8f8069170e not 
found: ID does not exist" Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.156196 4807 scope.go:117] "RemoveContainer" containerID="2707d4181cf03162783c2493e0b3e2bf42b40702d3e8c75ec20caceb826dc562" Nov 27 12:09:09 crc kubenswrapper[4807]: E1127 12:09:09.156739 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2707d4181cf03162783c2493e0b3e2bf42b40702d3e8c75ec20caceb826dc562\": container with ID starting with 2707d4181cf03162783c2493e0b3e2bf42b40702d3e8c75ec20caceb826dc562 not found: ID does not exist" containerID="2707d4181cf03162783c2493e0b3e2bf42b40702d3e8c75ec20caceb826dc562" Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.156760 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2707d4181cf03162783c2493e0b3e2bf42b40702d3e8c75ec20caceb826dc562"} err="failed to get container status \"2707d4181cf03162783c2493e0b3e2bf42b40702d3e8c75ec20caceb826dc562\": rpc error: code = NotFound desc = could not find container \"2707d4181cf03162783c2493e0b3e2bf42b40702d3e8c75ec20caceb826dc562\": container with ID starting with 2707d4181cf03162783c2493e0b3e2bf42b40702d3e8c75ec20caceb826dc562 not found: ID does not exist" Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.156773 4807 scope.go:117] "RemoveContainer" containerID="52a103ab329c62b6d5f758f22df457b81e292aaf47dfea6eaffcd711b4ae769a" Nov 27 12:09:09 crc kubenswrapper[4807]: E1127 12:09:09.157007 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52a103ab329c62b6d5f758f22df457b81e292aaf47dfea6eaffcd711b4ae769a\": container with ID starting with 52a103ab329c62b6d5f758f22df457b81e292aaf47dfea6eaffcd711b4ae769a not found: ID does not exist" containerID="52a103ab329c62b6d5f758f22df457b81e292aaf47dfea6eaffcd711b4ae769a" Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.157039 4807 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52a103ab329c62b6d5f758f22df457b81e292aaf47dfea6eaffcd711b4ae769a"} err="failed to get container status \"52a103ab329c62b6d5f758f22df457b81e292aaf47dfea6eaffcd711b4ae769a\": rpc error: code = NotFound desc = could not find container \"52a103ab329c62b6d5f758f22df457b81e292aaf47dfea6eaffcd711b4ae769a\": container with ID starting with 52a103ab329c62b6d5f758f22df457b81e292aaf47dfea6eaffcd711b4ae769a not found: ID does not exist" Nov 27 12:09:09 crc kubenswrapper[4807]: I1127 12:09:09.543698 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6791a75c-fb0a-468e-b80b-b339afb8c9ac" path="/var/lib/kubelet/pods/6791a75c-fb0a-468e-b80b-b339afb8c9ac/volumes" Nov 27 12:09:15 crc kubenswrapper[4807]: I1127 12:09:15.495517 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kg6jg/must-gather-k626h"] Nov 27 12:09:15 crc kubenswrapper[4807]: I1127 12:09:15.496398 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kg6jg/must-gather-k626h" podUID="ca489b67-7355-4cf9-a5f8-8fd359f37d63" containerName="copy" containerID="cri-o://ec6bf21ef39757de17f6205111e1c00b96dfb8ad9889f98794f319f431404c7e" gracePeriod=2 Nov 27 12:09:15 crc kubenswrapper[4807]: I1127 12:09:15.506928 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kg6jg/must-gather-k626h"] Nov 27 12:09:15 crc kubenswrapper[4807]: I1127 12:09:15.925695 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kg6jg_must-gather-k626h_ca489b67-7355-4cf9-a5f8-8fd359f37d63/copy/0.log" Nov 27 12:09:15 crc kubenswrapper[4807]: I1127 12:09:15.926742 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kg6jg/must-gather-k626h" Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.074731 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ca489b67-7355-4cf9-a5f8-8fd359f37d63-must-gather-output\") pod \"ca489b67-7355-4cf9-a5f8-8fd359f37d63\" (UID: \"ca489b67-7355-4cf9-a5f8-8fd359f37d63\") " Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.074890 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxmv6\" (UniqueName: \"kubernetes.io/projected/ca489b67-7355-4cf9-a5f8-8fd359f37d63-kube-api-access-sxmv6\") pod \"ca489b67-7355-4cf9-a5f8-8fd359f37d63\" (UID: \"ca489b67-7355-4cf9-a5f8-8fd359f37d63\") " Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.081241 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca489b67-7355-4cf9-a5f8-8fd359f37d63-kube-api-access-sxmv6" (OuterVolumeSpecName: "kube-api-access-sxmv6") pod "ca489b67-7355-4cf9-a5f8-8fd359f37d63" (UID: "ca489b67-7355-4cf9-a5f8-8fd359f37d63"). InnerVolumeSpecName "kube-api-access-sxmv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.097304 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kg6jg_must-gather-k626h_ca489b67-7355-4cf9-a5f8-8fd359f37d63/copy/0.log" Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.097801 4807 generic.go:334] "Generic (PLEG): container finished" podID="ca489b67-7355-4cf9-a5f8-8fd359f37d63" containerID="ec6bf21ef39757de17f6205111e1c00b96dfb8ad9889f98794f319f431404c7e" exitCode=143 Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.097862 4807 scope.go:117] "RemoveContainer" containerID="ec6bf21ef39757de17f6205111e1c00b96dfb8ad9889f98794f319f431404c7e" Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.097922 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kg6jg/must-gather-k626h" Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.138442 4807 scope.go:117] "RemoveContainer" containerID="b21e5d0fdb37f8175263d8b63106791709eaa244aa80616b13353856b2d346b8" Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.176535 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxmv6\" (UniqueName: \"kubernetes.io/projected/ca489b67-7355-4cf9-a5f8-8fd359f37d63-kube-api-access-sxmv6\") on node \"crc\" DevicePath \"\"" Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.224572 4807 scope.go:117] "RemoveContainer" containerID="ec6bf21ef39757de17f6205111e1c00b96dfb8ad9889f98794f319f431404c7e" Nov 27 12:09:16 crc kubenswrapper[4807]: E1127 12:09:16.224968 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec6bf21ef39757de17f6205111e1c00b96dfb8ad9889f98794f319f431404c7e\": container with ID starting with ec6bf21ef39757de17f6205111e1c00b96dfb8ad9889f98794f319f431404c7e not found: ID does not exist" 
containerID="ec6bf21ef39757de17f6205111e1c00b96dfb8ad9889f98794f319f431404c7e" Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.225005 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec6bf21ef39757de17f6205111e1c00b96dfb8ad9889f98794f319f431404c7e"} err="failed to get container status \"ec6bf21ef39757de17f6205111e1c00b96dfb8ad9889f98794f319f431404c7e\": rpc error: code = NotFound desc = could not find container \"ec6bf21ef39757de17f6205111e1c00b96dfb8ad9889f98794f319f431404c7e\": container with ID starting with ec6bf21ef39757de17f6205111e1c00b96dfb8ad9889f98794f319f431404c7e not found: ID does not exist" Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.225039 4807 scope.go:117] "RemoveContainer" containerID="b21e5d0fdb37f8175263d8b63106791709eaa244aa80616b13353856b2d346b8" Nov 27 12:09:16 crc kubenswrapper[4807]: E1127 12:09:16.225394 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b21e5d0fdb37f8175263d8b63106791709eaa244aa80616b13353856b2d346b8\": container with ID starting with b21e5d0fdb37f8175263d8b63106791709eaa244aa80616b13353856b2d346b8 not found: ID does not exist" containerID="b21e5d0fdb37f8175263d8b63106791709eaa244aa80616b13353856b2d346b8" Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.225432 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b21e5d0fdb37f8175263d8b63106791709eaa244aa80616b13353856b2d346b8"} err="failed to get container status \"b21e5d0fdb37f8175263d8b63106791709eaa244aa80616b13353856b2d346b8\": rpc error: code = NotFound desc = could not find container \"b21e5d0fdb37f8175263d8b63106791709eaa244aa80616b13353856b2d346b8\": container with ID starting with b21e5d0fdb37f8175263d8b63106791709eaa244aa80616b13353856b2d346b8 not found: ID does not exist" Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.233459 4807 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca489b67-7355-4cf9-a5f8-8fd359f37d63-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ca489b67-7355-4cf9-a5f8-8fd359f37d63" (UID: "ca489b67-7355-4cf9-a5f8-8fd359f37d63"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:09:16 crc kubenswrapper[4807]: I1127 12:09:16.278506 4807 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ca489b67-7355-4cf9-a5f8-8fd359f37d63-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 27 12:09:17 crc kubenswrapper[4807]: I1127 12:09:17.555007 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca489b67-7355-4cf9-a5f8-8fd359f37d63" path="/var/lib/kubelet/pods/ca489b67-7355-4cf9-a5f8-8fd359f37d63/volumes" Nov 27 12:09:20 crc kubenswrapper[4807]: I1127 12:09:20.921916 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 12:09:20 crc kubenswrapper[4807]: I1127 12:09:20.922475 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 12:09:42 crc kubenswrapper[4807]: I1127 12:09:42.063770 4807 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-7b6fc97755-xnlzr" podUID="ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Nov 27 12:09:50 crc 
kubenswrapper[4807]: I1127 12:09:50.921548 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 12:09:50 crc kubenswrapper[4807]: I1127 12:09:50.922132 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 12:09:50 crc kubenswrapper[4807]: I1127 12:09:50.922177 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 12:09:50 crc kubenswrapper[4807]: I1127 12:09:50.922966 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e6ad55be444bb836094e989710fe293e7e4b3c2a590dc0117eb67fcd7f9ef509"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 12:09:50 crc kubenswrapper[4807]: I1127 12:09:50.923018 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://e6ad55be444bb836094e989710fe293e7e4b3c2a590dc0117eb67fcd7f9ef509" gracePeriod=600 Nov 27 12:09:51 crc kubenswrapper[4807]: I1127 12:09:51.404011 4807 generic.go:334] "Generic (PLEG): container finished" podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" 
containerID="e6ad55be444bb836094e989710fe293e7e4b3c2a590dc0117eb67fcd7f9ef509" exitCode=0 Nov 27 12:09:51 crc kubenswrapper[4807]: I1127 12:09:51.404068 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"e6ad55be444bb836094e989710fe293e7e4b3c2a590dc0117eb67fcd7f9ef509"} Nov 27 12:09:51 crc kubenswrapper[4807]: I1127 12:09:51.404406 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91"} Nov 27 12:09:51 crc kubenswrapper[4807]: I1127 12:09:51.404433 4807 scope.go:117] "RemoveContainer" containerID="34658de4f9bc440cabb7f026c8df6d6b099cb4f74baf3a54c50a78de6ef045c6" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.640999 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f58j6/must-gather-6bxh6"] Nov 27 12:11:45 crc kubenswrapper[4807]: E1127 12:11:45.641946 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6791a75c-fb0a-468e-b80b-b339afb8c9ac" containerName="extract-utilities" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.641959 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6791a75c-fb0a-468e-b80b-b339afb8c9ac" containerName="extract-utilities" Nov 27 12:11:45 crc kubenswrapper[4807]: E1127 12:11:45.641975 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca489b67-7355-4cf9-a5f8-8fd359f37d63" containerName="gather" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.641983 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca489b67-7355-4cf9-a5f8-8fd359f37d63" containerName="gather" Nov 27 12:11:45 crc kubenswrapper[4807]: E1127 12:11:45.641991 4807 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6791a75c-fb0a-468e-b80b-b339afb8c9ac" containerName="extract-content" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.641997 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6791a75c-fb0a-468e-b80b-b339afb8c9ac" containerName="extract-content" Nov 27 12:11:45 crc kubenswrapper[4807]: E1127 12:11:45.642012 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca489b67-7355-4cf9-a5f8-8fd359f37d63" containerName="copy" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.642018 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca489b67-7355-4cf9-a5f8-8fd359f37d63" containerName="copy" Nov 27 12:11:45 crc kubenswrapper[4807]: E1127 12:11:45.642027 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6791a75c-fb0a-468e-b80b-b339afb8c9ac" containerName="registry-server" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.642033 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="6791a75c-fb0a-468e-b80b-b339afb8c9ac" containerName="registry-server" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.643420 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca489b67-7355-4cf9-a5f8-8fd359f37d63" containerName="gather" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.643450 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="6791a75c-fb0a-468e-b80b-b339afb8c9ac" containerName="registry-server" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.643479 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca489b67-7355-4cf9-a5f8-8fd359f37d63" containerName="copy" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.645575 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f58j6/must-gather-6bxh6" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.650734 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f58j6/must-gather-6bxh6"] Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.655145 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-f58j6"/"default-dockercfg-4khtz" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.655239 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f58j6"/"kube-root-ca.crt" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.658517 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-f58j6"/"openshift-service-ca.crt" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.670936 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d508b66b-7e3a-4db5-9884-c64fcdd3c1c7-must-gather-output\") pod \"must-gather-6bxh6\" (UID: \"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7\") " pod="openshift-must-gather-f58j6/must-gather-6bxh6" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.671062 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x68h6\" (UniqueName: \"kubernetes.io/projected/d508b66b-7e3a-4db5-9884-c64fcdd3c1c7-kube-api-access-x68h6\") pod \"must-gather-6bxh6\" (UID: \"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7\") " pod="openshift-must-gather-f58j6/must-gather-6bxh6" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.772817 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x68h6\" (UniqueName: \"kubernetes.io/projected/d508b66b-7e3a-4db5-9884-c64fcdd3c1c7-kube-api-access-x68h6\") pod \"must-gather-6bxh6\" (UID: \"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7\") " 
pod="openshift-must-gather-f58j6/must-gather-6bxh6" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.772921 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d508b66b-7e3a-4db5-9884-c64fcdd3c1c7-must-gather-output\") pod \"must-gather-6bxh6\" (UID: \"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7\") " pod="openshift-must-gather-f58j6/must-gather-6bxh6" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.773294 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d508b66b-7e3a-4db5-9884-c64fcdd3c1c7-must-gather-output\") pod \"must-gather-6bxh6\" (UID: \"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7\") " pod="openshift-must-gather-f58j6/must-gather-6bxh6" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.789654 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x68h6\" (UniqueName: \"kubernetes.io/projected/d508b66b-7e3a-4db5-9884-c64fcdd3c1c7-kube-api-access-x68h6\") pod \"must-gather-6bxh6\" (UID: \"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7\") " pod="openshift-must-gather-f58j6/must-gather-6bxh6" Nov 27 12:11:45 crc kubenswrapper[4807]: I1127 12:11:45.974617 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f58j6/must-gather-6bxh6" Nov 27 12:11:46 crc kubenswrapper[4807]: I1127 12:11:46.486565 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f58j6/must-gather-6bxh6"] Nov 27 12:11:46 crc kubenswrapper[4807]: I1127 12:11:46.574493 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f58j6/must-gather-6bxh6" event={"ID":"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7","Type":"ContainerStarted","Data":"d3172e5b6415aba52fb957c7267e8c433761a09127346ca5996275f4053b77c6"} Nov 27 12:11:47 crc kubenswrapper[4807]: I1127 12:11:47.599934 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f58j6/must-gather-6bxh6" event={"ID":"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7","Type":"ContainerStarted","Data":"63ce64dc9451dce0f5f4174f9798237bfc6387894dac5dff99ce9aad2c93200f"} Nov 27 12:11:47 crc kubenswrapper[4807]: I1127 12:11:47.600025 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f58j6/must-gather-6bxh6" event={"ID":"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7","Type":"ContainerStarted","Data":"dde92ac6f8295a88872b3fd94dc63ede50d3a8a70691b30f0d94ed95d4f4de33"} Nov 27 12:11:47 crc kubenswrapper[4807]: I1127 12:11:47.648427 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f58j6/must-gather-6bxh6" podStartSLOduration=2.64841173 podStartE2EDuration="2.64841173s" podCreationTimestamp="2025-11-27 12:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 12:11:47.6349046 +0000 UTC m=+3748.734402808" watchObservedRunningTime="2025-11-27 12:11:47.64841173 +0000 UTC m=+3748.747909928" Nov 27 12:11:50 crc kubenswrapper[4807]: I1127 12:11:50.165701 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f58j6/crc-debug-5pvws"] Nov 27 12:11:50 crc kubenswrapper[4807]: I1127 
12:11:50.167558 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f58j6/crc-debug-5pvws" Nov 27 12:11:50 crc kubenswrapper[4807]: I1127 12:11:50.358089 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqp89\" (UniqueName: \"kubernetes.io/projected/78a942e9-1895-4fd8-8428-d313e42deeef-kube-api-access-zqp89\") pod \"crc-debug-5pvws\" (UID: \"78a942e9-1895-4fd8-8428-d313e42deeef\") " pod="openshift-must-gather-f58j6/crc-debug-5pvws" Nov 27 12:11:50 crc kubenswrapper[4807]: I1127 12:11:50.358571 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78a942e9-1895-4fd8-8428-d313e42deeef-host\") pod \"crc-debug-5pvws\" (UID: \"78a942e9-1895-4fd8-8428-d313e42deeef\") " pod="openshift-must-gather-f58j6/crc-debug-5pvws" Nov 27 12:11:50 crc kubenswrapper[4807]: I1127 12:11:50.460522 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78a942e9-1895-4fd8-8428-d313e42deeef-host\") pod \"crc-debug-5pvws\" (UID: \"78a942e9-1895-4fd8-8428-d313e42deeef\") " pod="openshift-must-gather-f58j6/crc-debug-5pvws" Nov 27 12:11:50 crc kubenswrapper[4807]: I1127 12:11:50.460678 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78a942e9-1895-4fd8-8428-d313e42deeef-host\") pod \"crc-debug-5pvws\" (UID: \"78a942e9-1895-4fd8-8428-d313e42deeef\") " pod="openshift-must-gather-f58j6/crc-debug-5pvws" Nov 27 12:11:50 crc kubenswrapper[4807]: I1127 12:11:50.460683 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqp89\" (UniqueName: \"kubernetes.io/projected/78a942e9-1895-4fd8-8428-d313e42deeef-kube-api-access-zqp89\") pod \"crc-debug-5pvws\" (UID: \"78a942e9-1895-4fd8-8428-d313e42deeef\") " 
pod="openshift-must-gather-f58j6/crc-debug-5pvws" Nov 27 12:11:50 crc kubenswrapper[4807]: I1127 12:11:50.478597 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqp89\" (UniqueName: \"kubernetes.io/projected/78a942e9-1895-4fd8-8428-d313e42deeef-kube-api-access-zqp89\") pod \"crc-debug-5pvws\" (UID: \"78a942e9-1895-4fd8-8428-d313e42deeef\") " pod="openshift-must-gather-f58j6/crc-debug-5pvws" Nov 27 12:11:50 crc kubenswrapper[4807]: I1127 12:11:50.484434 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f58j6/crc-debug-5pvws" Nov 27 12:11:50 crc kubenswrapper[4807]: I1127 12:11:50.625898 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f58j6/crc-debug-5pvws" event={"ID":"78a942e9-1895-4fd8-8428-d313e42deeef","Type":"ContainerStarted","Data":"436ba71bc396603dbf78c37f7f8f36ad5189cc706b381162a65f6e7380c2c612"} Nov 27 12:11:51 crc kubenswrapper[4807]: I1127 12:11:51.637414 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f58j6/crc-debug-5pvws" event={"ID":"78a942e9-1895-4fd8-8428-d313e42deeef","Type":"ContainerStarted","Data":"b7a13278067b3e81b2da1e6dd0301ec24dff504783147b8119ad18e19f223f50"} Nov 27 12:11:51 crc kubenswrapper[4807]: I1127 12:11:51.657750 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f58j6/crc-debug-5pvws" podStartSLOduration=1.6577316930000001 podStartE2EDuration="1.657731693s" podCreationTimestamp="2025-11-27 12:11:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 12:11:51.649941096 +0000 UTC m=+3752.749439294" watchObservedRunningTime="2025-11-27 12:11:51.657731693 +0000 UTC m=+3752.757229891" Nov 27 12:12:20 crc kubenswrapper[4807]: I1127 12:12:20.922059 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 12:12:20 crc kubenswrapper[4807]: I1127 12:12:20.922615 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 12:12:22 crc kubenswrapper[4807]: I1127 12:12:22.904115 4807 generic.go:334] "Generic (PLEG): container finished" podID="78a942e9-1895-4fd8-8428-d313e42deeef" containerID="b7a13278067b3e81b2da1e6dd0301ec24dff504783147b8119ad18e19f223f50" exitCode=0 Nov 27 12:12:22 crc kubenswrapper[4807]: I1127 12:12:22.904214 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f58j6/crc-debug-5pvws" event={"ID":"78a942e9-1895-4fd8-8428-d313e42deeef","Type":"ContainerDied","Data":"b7a13278067b3e81b2da1e6dd0301ec24dff504783147b8119ad18e19f223f50"} Nov 27 12:12:24 crc kubenswrapper[4807]: I1127 12:12:24.018834 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f58j6/crc-debug-5pvws" Nov 27 12:12:24 crc kubenswrapper[4807]: I1127 12:12:24.053941 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f58j6/crc-debug-5pvws"] Nov 27 12:12:24 crc kubenswrapper[4807]: I1127 12:12:24.064092 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f58j6/crc-debug-5pvws"] Nov 27 12:12:24 crc kubenswrapper[4807]: I1127 12:12:24.208623 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqp89\" (UniqueName: \"kubernetes.io/projected/78a942e9-1895-4fd8-8428-d313e42deeef-kube-api-access-zqp89\") pod \"78a942e9-1895-4fd8-8428-d313e42deeef\" (UID: \"78a942e9-1895-4fd8-8428-d313e42deeef\") " Nov 27 12:12:24 crc kubenswrapper[4807]: I1127 12:12:24.208864 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78a942e9-1895-4fd8-8428-d313e42deeef-host\") pod \"78a942e9-1895-4fd8-8428-d313e42deeef\" (UID: \"78a942e9-1895-4fd8-8428-d313e42deeef\") " Nov 27 12:12:24 crc kubenswrapper[4807]: I1127 12:12:24.209302 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78a942e9-1895-4fd8-8428-d313e42deeef-host" (OuterVolumeSpecName: "host") pod "78a942e9-1895-4fd8-8428-d313e42deeef" (UID: "78a942e9-1895-4fd8-8428-d313e42deeef"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 12:12:24 crc kubenswrapper[4807]: I1127 12:12:24.210149 4807 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/78a942e9-1895-4fd8-8428-d313e42deeef-host\") on node \"crc\" DevicePath \"\"" Nov 27 12:12:24 crc kubenswrapper[4807]: I1127 12:12:24.214760 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a942e9-1895-4fd8-8428-d313e42deeef-kube-api-access-zqp89" (OuterVolumeSpecName: "kube-api-access-zqp89") pod "78a942e9-1895-4fd8-8428-d313e42deeef" (UID: "78a942e9-1895-4fd8-8428-d313e42deeef"). InnerVolumeSpecName "kube-api-access-zqp89". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:12:24 crc kubenswrapper[4807]: I1127 12:12:24.311938 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqp89\" (UniqueName: \"kubernetes.io/projected/78a942e9-1895-4fd8-8428-d313e42deeef-kube-api-access-zqp89\") on node \"crc\" DevicePath \"\"" Nov 27 12:12:24 crc kubenswrapper[4807]: I1127 12:12:24.925220 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="436ba71bc396603dbf78c37f7f8f36ad5189cc706b381162a65f6e7380c2c612" Nov 27 12:12:24 crc kubenswrapper[4807]: I1127 12:12:24.925331 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f58j6/crc-debug-5pvws" Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.373268 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f58j6/crc-debug-hgjg9"] Nov 27 12:12:25 crc kubenswrapper[4807]: E1127 12:12:25.373616 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78a942e9-1895-4fd8-8428-d313e42deeef" containerName="container-00" Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.373629 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="78a942e9-1895-4fd8-8428-d313e42deeef" containerName="container-00" Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.373820 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="78a942e9-1895-4fd8-8428-d313e42deeef" containerName="container-00" Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.374381 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f58j6/crc-debug-hgjg9" Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.496638 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/385da05b-805c-4927-a4a6-35e1e24913a0-host\") pod \"crc-debug-hgjg9\" (UID: \"385da05b-805c-4927-a4a6-35e1e24913a0\") " pod="openshift-must-gather-f58j6/crc-debug-hgjg9" Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.497018 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf6qf\" (UniqueName: \"kubernetes.io/projected/385da05b-805c-4927-a4a6-35e1e24913a0-kube-api-access-sf6qf\") pod \"crc-debug-hgjg9\" (UID: \"385da05b-805c-4927-a4a6-35e1e24913a0\") " pod="openshift-must-gather-f58j6/crc-debug-hgjg9" Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.543608 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78a942e9-1895-4fd8-8428-d313e42deeef" 
path="/var/lib/kubelet/pods/78a942e9-1895-4fd8-8428-d313e42deeef/volumes" Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.598817 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf6qf\" (UniqueName: \"kubernetes.io/projected/385da05b-805c-4927-a4a6-35e1e24913a0-kube-api-access-sf6qf\") pod \"crc-debug-hgjg9\" (UID: \"385da05b-805c-4927-a4a6-35e1e24913a0\") " pod="openshift-must-gather-f58j6/crc-debug-hgjg9" Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.598923 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/385da05b-805c-4927-a4a6-35e1e24913a0-host\") pod \"crc-debug-hgjg9\" (UID: \"385da05b-805c-4927-a4a6-35e1e24913a0\") " pod="openshift-must-gather-f58j6/crc-debug-hgjg9" Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.599105 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/385da05b-805c-4927-a4a6-35e1e24913a0-host\") pod \"crc-debug-hgjg9\" (UID: \"385da05b-805c-4927-a4a6-35e1e24913a0\") " pod="openshift-must-gather-f58j6/crc-debug-hgjg9" Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.623997 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf6qf\" (UniqueName: \"kubernetes.io/projected/385da05b-805c-4927-a4a6-35e1e24913a0-kube-api-access-sf6qf\") pod \"crc-debug-hgjg9\" (UID: \"385da05b-805c-4927-a4a6-35e1e24913a0\") " pod="openshift-must-gather-f58j6/crc-debug-hgjg9" Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.701800 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f58j6/crc-debug-hgjg9" Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.933569 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f58j6/crc-debug-hgjg9" event={"ID":"385da05b-805c-4927-a4a6-35e1e24913a0","Type":"ContainerStarted","Data":"49d6767c587a25896be808d45f804b755469c731d56af0db60ae8fdae184ec96"} Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.933617 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f58j6/crc-debug-hgjg9" event={"ID":"385da05b-805c-4927-a4a6-35e1e24913a0","Type":"ContainerStarted","Data":"6bad7ac5257ea037710947c910230e0d98f6c7b66698cc712eb381eed59327e7"} Nov 27 12:12:25 crc kubenswrapper[4807]: I1127 12:12:25.948457 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f58j6/crc-debug-hgjg9" podStartSLOduration=0.948439574 podStartE2EDuration="948.439574ms" podCreationTimestamp="2025-11-27 12:12:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 12:12:25.943864922 +0000 UTC m=+3787.043363130" watchObservedRunningTime="2025-11-27 12:12:25.948439574 +0000 UTC m=+3787.047937772" Nov 27 12:12:26 crc kubenswrapper[4807]: I1127 12:12:26.945793 4807 generic.go:334] "Generic (PLEG): container finished" podID="385da05b-805c-4927-a4a6-35e1e24913a0" containerID="49d6767c587a25896be808d45f804b755469c731d56af0db60ae8fdae184ec96" exitCode=0 Nov 27 12:12:26 crc kubenswrapper[4807]: I1127 12:12:26.945824 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f58j6/crc-debug-hgjg9" event={"ID":"385da05b-805c-4927-a4a6-35e1e24913a0","Type":"ContainerDied","Data":"49d6767c587a25896be808d45f804b755469c731d56af0db60ae8fdae184ec96"} Nov 27 12:12:28 crc kubenswrapper[4807]: I1127 12:12:28.056584 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f58j6/crc-debug-hgjg9" Nov 27 12:12:28 crc kubenswrapper[4807]: I1127 12:12:28.088235 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f58j6/crc-debug-hgjg9"] Nov 27 12:12:28 crc kubenswrapper[4807]: I1127 12:12:28.096190 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f58j6/crc-debug-hgjg9"] Nov 27 12:12:28 crc kubenswrapper[4807]: I1127 12:12:28.244278 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/385da05b-805c-4927-a4a6-35e1e24913a0-host\") pod \"385da05b-805c-4927-a4a6-35e1e24913a0\" (UID: \"385da05b-805c-4927-a4a6-35e1e24913a0\") " Nov 27 12:12:28 crc kubenswrapper[4807]: I1127 12:12:28.244321 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf6qf\" (UniqueName: \"kubernetes.io/projected/385da05b-805c-4927-a4a6-35e1e24913a0-kube-api-access-sf6qf\") pod \"385da05b-805c-4927-a4a6-35e1e24913a0\" (UID: \"385da05b-805c-4927-a4a6-35e1e24913a0\") " Nov 27 12:12:28 crc kubenswrapper[4807]: I1127 12:12:28.244475 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/385da05b-805c-4927-a4a6-35e1e24913a0-host" (OuterVolumeSpecName: "host") pod "385da05b-805c-4927-a4a6-35e1e24913a0" (UID: "385da05b-805c-4927-a4a6-35e1e24913a0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 12:12:28 crc kubenswrapper[4807]: I1127 12:12:28.244850 4807 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/385da05b-805c-4927-a4a6-35e1e24913a0-host\") on node \"crc\" DevicePath \"\"" Nov 27 12:12:28 crc kubenswrapper[4807]: I1127 12:12:28.252238 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/385da05b-805c-4927-a4a6-35e1e24913a0-kube-api-access-sf6qf" (OuterVolumeSpecName: "kube-api-access-sf6qf") pod "385da05b-805c-4927-a4a6-35e1e24913a0" (UID: "385da05b-805c-4927-a4a6-35e1e24913a0"). InnerVolumeSpecName "kube-api-access-sf6qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:12:28 crc kubenswrapper[4807]: I1127 12:12:28.346288 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf6qf\" (UniqueName: \"kubernetes.io/projected/385da05b-805c-4927-a4a6-35e1e24913a0-kube-api-access-sf6qf\") on node \"crc\" DevicePath \"\"" Nov 27 12:12:28 crc kubenswrapper[4807]: I1127 12:12:28.973013 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bad7ac5257ea037710947c910230e0d98f6c7b66698cc712eb381eed59327e7" Nov 27 12:12:28 crc kubenswrapper[4807]: I1127 12:12:28.973049 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f58j6/crc-debug-hgjg9" Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.290164 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f58j6/crc-debug-fftsr"] Nov 27 12:12:29 crc kubenswrapper[4807]: E1127 12:12:29.290691 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385da05b-805c-4927-a4a6-35e1e24913a0" containerName="container-00" Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.290707 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="385da05b-805c-4927-a4a6-35e1e24913a0" containerName="container-00" Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.290973 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="385da05b-805c-4927-a4a6-35e1e24913a0" containerName="container-00" Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.291715 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f58j6/crc-debug-fftsr" Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.373870 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d2f45ee-1a23-4b70-adcb-761de0316b9d-host\") pod \"crc-debug-fftsr\" (UID: \"2d2f45ee-1a23-4b70-adcb-761de0316b9d\") " pod="openshift-must-gather-f58j6/crc-debug-fftsr" Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.374215 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp5b5\" (UniqueName: \"kubernetes.io/projected/2d2f45ee-1a23-4b70-adcb-761de0316b9d-kube-api-access-tp5b5\") pod \"crc-debug-fftsr\" (UID: \"2d2f45ee-1a23-4b70-adcb-761de0316b9d\") " pod="openshift-must-gather-f58j6/crc-debug-fftsr" Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.475551 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/2d2f45ee-1a23-4b70-adcb-761de0316b9d-host\") pod \"crc-debug-fftsr\" (UID: \"2d2f45ee-1a23-4b70-adcb-761de0316b9d\") " pod="openshift-must-gather-f58j6/crc-debug-fftsr" Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.475619 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp5b5\" (UniqueName: \"kubernetes.io/projected/2d2f45ee-1a23-4b70-adcb-761de0316b9d-kube-api-access-tp5b5\") pod \"crc-debug-fftsr\" (UID: \"2d2f45ee-1a23-4b70-adcb-761de0316b9d\") " pod="openshift-must-gather-f58j6/crc-debug-fftsr" Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.475735 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d2f45ee-1a23-4b70-adcb-761de0316b9d-host\") pod \"crc-debug-fftsr\" (UID: \"2d2f45ee-1a23-4b70-adcb-761de0316b9d\") " pod="openshift-must-gather-f58j6/crc-debug-fftsr" Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.493788 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp5b5\" (UniqueName: \"kubernetes.io/projected/2d2f45ee-1a23-4b70-adcb-761de0316b9d-kube-api-access-tp5b5\") pod \"crc-debug-fftsr\" (UID: \"2d2f45ee-1a23-4b70-adcb-761de0316b9d\") " pod="openshift-must-gather-f58j6/crc-debug-fftsr" Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.604823 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="385da05b-805c-4927-a4a6-35e1e24913a0" path="/var/lib/kubelet/pods/385da05b-805c-4927-a4a6-35e1e24913a0/volumes" Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.620114 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f58j6/crc-debug-fftsr" Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.983995 4807 generic.go:334] "Generic (PLEG): container finished" podID="2d2f45ee-1a23-4b70-adcb-761de0316b9d" containerID="cda341924e9eb5aa8c58b6d2f512b86d8f8a3e7452e13e6158ece050be0f1e24" exitCode=0 Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.984097 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f58j6/crc-debug-fftsr" event={"ID":"2d2f45ee-1a23-4b70-adcb-761de0316b9d","Type":"ContainerDied","Data":"cda341924e9eb5aa8c58b6d2f512b86d8f8a3e7452e13e6158ece050be0f1e24"} Nov 27 12:12:29 crc kubenswrapper[4807]: I1127 12:12:29.984347 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f58j6/crc-debug-fftsr" event={"ID":"2d2f45ee-1a23-4b70-adcb-761de0316b9d","Type":"ContainerStarted","Data":"b30a88aea1b92fc6deaf72e5b2a50182d1e678af77e08a15c3d05d46d90f29be"} Nov 27 12:12:30 crc kubenswrapper[4807]: I1127 12:12:30.019723 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f58j6/crc-debug-fftsr"] Nov 27 12:12:30 crc kubenswrapper[4807]: I1127 12:12:30.027272 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f58j6/crc-debug-fftsr"] Nov 27 12:12:31 crc kubenswrapper[4807]: I1127 12:12:31.080423 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f58j6/crc-debug-fftsr" Nov 27 12:12:31 crc kubenswrapper[4807]: I1127 12:12:31.118631 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp5b5\" (UniqueName: \"kubernetes.io/projected/2d2f45ee-1a23-4b70-adcb-761de0316b9d-kube-api-access-tp5b5\") pod \"2d2f45ee-1a23-4b70-adcb-761de0316b9d\" (UID: \"2d2f45ee-1a23-4b70-adcb-761de0316b9d\") " Nov 27 12:12:31 crc kubenswrapper[4807]: I1127 12:12:31.118847 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d2f45ee-1a23-4b70-adcb-761de0316b9d-host\") pod \"2d2f45ee-1a23-4b70-adcb-761de0316b9d\" (UID: \"2d2f45ee-1a23-4b70-adcb-761de0316b9d\") " Nov 27 12:12:31 crc kubenswrapper[4807]: I1127 12:12:31.118994 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2d2f45ee-1a23-4b70-adcb-761de0316b9d-host" (OuterVolumeSpecName: "host") pod "2d2f45ee-1a23-4b70-adcb-761de0316b9d" (UID: "2d2f45ee-1a23-4b70-adcb-761de0316b9d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 27 12:12:31 crc kubenswrapper[4807]: I1127 12:12:31.119312 4807 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2d2f45ee-1a23-4b70-adcb-761de0316b9d-host\") on node \"crc\" DevicePath \"\"" Nov 27 12:12:31 crc kubenswrapper[4807]: I1127 12:12:31.136037 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2f45ee-1a23-4b70-adcb-761de0316b9d-kube-api-access-tp5b5" (OuterVolumeSpecName: "kube-api-access-tp5b5") pod "2d2f45ee-1a23-4b70-adcb-761de0316b9d" (UID: "2d2f45ee-1a23-4b70-adcb-761de0316b9d"). InnerVolumeSpecName "kube-api-access-tp5b5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:12:31 crc kubenswrapper[4807]: I1127 12:12:31.221413 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp5b5\" (UniqueName: \"kubernetes.io/projected/2d2f45ee-1a23-4b70-adcb-761de0316b9d-kube-api-access-tp5b5\") on node \"crc\" DevicePath \"\"" Nov 27 12:12:31 crc kubenswrapper[4807]: I1127 12:12:31.543389 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d2f45ee-1a23-4b70-adcb-761de0316b9d" path="/var/lib/kubelet/pods/2d2f45ee-1a23-4b70-adcb-761de0316b9d/volumes" Nov 27 12:12:32 crc kubenswrapper[4807]: I1127 12:12:32.003554 4807 scope.go:117] "RemoveContainer" containerID="cda341924e9eb5aa8c58b6d2f512b86d8f8a3e7452e13e6158ece050be0f1e24" Nov 27 12:12:32 crc kubenswrapper[4807]: I1127 12:12:32.003602 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f58j6/crc-debug-fftsr" Nov 27 12:12:50 crc kubenswrapper[4807]: I1127 12:12:50.921370 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 12:12:50 crc kubenswrapper[4807]: I1127 12:12:50.921996 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 12:12:54 crc kubenswrapper[4807]: I1127 12:12:54.256325 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b4c869746-crx9p_b3ebef31-c3b4-4d86-96b1-92bb2038fcc2/barbican-api/0.log" Nov 27 12:12:54 crc kubenswrapper[4807]: I1127 12:12:54.371209 
4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7b4c869746-crx9p_b3ebef31-c3b4-4d86-96b1-92bb2038fcc2/barbican-api-log/0.log" Nov 27 12:12:54 crc kubenswrapper[4807]: I1127 12:12:54.471044 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-67945bfd5d-wnmj5_25b00dcc-1d4a-4d61-9865-db7b0515e360/barbican-keystone-listener/0.log" Nov 27 12:12:54 crc kubenswrapper[4807]: I1127 12:12:54.518962 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-67945bfd5d-wnmj5_25b00dcc-1d4a-4d61-9865-db7b0515e360/barbican-keystone-listener-log/0.log" Nov 27 12:12:54 crc kubenswrapper[4807]: I1127 12:12:54.664889 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5df9b5c779-cqvbn_b22f0add-3876-4db6-a6ac-83bf95c37ea6/barbican-worker/0.log" Nov 27 12:12:54 crc kubenswrapper[4807]: I1127 12:12:54.712129 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5df9b5c779-cqvbn_b22f0add-3876-4db6-a6ac-83bf95c37ea6/barbican-worker-log/0.log" Nov 27 12:12:54 crc kubenswrapper[4807]: I1127 12:12:54.870837 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-xgdqb_a94d3cc9-c680-4f54-a2b6-0f55690f4cfa/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:12:54 crc kubenswrapper[4807]: I1127 12:12:54.950192 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2310e932-c289-4fe8-a5f9-ee9ce3ce915b/ceilometer-central-agent/0.log" Nov 27 12:12:55 crc kubenswrapper[4807]: I1127 12:12:55.030379 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2310e932-c289-4fe8-a5f9-ee9ce3ce915b/ceilometer-notification-agent/0.log" Nov 27 12:12:55 crc kubenswrapper[4807]: I1127 12:12:55.072846 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_2310e932-c289-4fe8-a5f9-ee9ce3ce915b/proxy-httpd/0.log" Nov 27 12:12:55 crc kubenswrapper[4807]: I1127 12:12:55.163483 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2310e932-c289-4fe8-a5f9-ee9ce3ce915b/sg-core/0.log" Nov 27 12:12:55 crc kubenswrapper[4807]: I1127 12:12:55.281477 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_722777ce-cfa6-4b7d-96ba-452a6998356d/cinder-api/0.log" Nov 27 12:12:55 crc kubenswrapper[4807]: I1127 12:12:55.316353 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_722777ce-cfa6-4b7d-96ba-452a6998356d/cinder-api-log/0.log" Nov 27 12:12:55 crc kubenswrapper[4807]: I1127 12:12:55.481804 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_eb01b182-7b19-44ea-874b-3ad6a1ebb6a7/probe/0.log" Nov 27 12:12:55 crc kubenswrapper[4807]: I1127 12:12:55.498612 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_eb01b182-7b19-44ea-874b-3ad6a1ebb6a7/cinder-scheduler/0.log" Nov 27 12:12:55 crc kubenswrapper[4807]: I1127 12:12:55.625968 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-64std_193a1dcb-8e1d-4c2b-be8c-92a94a5dfb9d/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:12:55 crc kubenswrapper[4807]: I1127 12:12:55.722224 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dkpzw_369043db-4f00-4bbd-ab16-6d8f27564af2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:12:55 crc kubenswrapper[4807]: I1127 12:12:55.858065 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-956t7_6f7c0ed3-d807-4035-b4c9-a2f906d06c46/init/0.log" Nov 27 12:12:56 crc kubenswrapper[4807]: I1127 12:12:56.064781 
4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2wl4x_7cfe00fa-307e-460b-a77e-a57439954c87/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:12:56 crc kubenswrapper[4807]: I1127 12:12:56.066317 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-956t7_6f7c0ed3-d807-4035-b4c9-a2f906d06c46/init/0.log" Nov 27 12:12:56 crc kubenswrapper[4807]: I1127 12:12:56.067902 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-956t7_6f7c0ed3-d807-4035-b4c9-a2f906d06c46/dnsmasq-dns/0.log" Nov 27 12:12:56 crc kubenswrapper[4807]: I1127 12:12:56.260185 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0a059d86-8a32-481a-80c7-e9675cb921b9/glance-log/0.log" Nov 27 12:12:56 crc kubenswrapper[4807]: I1127 12:12:56.277728 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0a059d86-8a32-481a-80c7-e9675cb921b9/glance-httpd/0.log" Nov 27 12:12:56 crc kubenswrapper[4807]: I1127 12:12:56.423006 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_04b42996-10c7-401c-b91b-e0ab4e100173/glance-httpd/0.log" Nov 27 12:12:56 crc kubenswrapper[4807]: I1127 12:12:56.449618 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_04b42996-10c7-401c-b91b-e0ab4e100173/glance-log/0.log" Nov 27 12:12:56 crc kubenswrapper[4807]: I1127 12:12:56.569357 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d69cff6fb-88t5t_ba9d500c-ec74-4755-924d-8b6160bb51dc/horizon/0.log" Nov 27 12:12:56 crc kubenswrapper[4807]: I1127 12:12:56.707535 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-9ksf2_e50793ea-c215-407b-ac8f-a5767166a0dd/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:12:56 crc kubenswrapper[4807]: I1127 12:12:56.899091 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7d69cff6fb-88t5t_ba9d500c-ec74-4755-924d-8b6160bb51dc/horizon-log/0.log" Nov 27 12:12:56 crc kubenswrapper[4807]: I1127 12:12:56.958355 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-xc7q6_b0775b68-e606-412d-a9b9-1f8eb98bbd63/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:12:57 crc kubenswrapper[4807]: I1127 12:12:57.239235 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29404081-8hnvz_d0a23453-ffa9-450e-a401-8f3c4a917196/keystone-cron/0.log" Nov 27 12:12:57 crc kubenswrapper[4807]: I1127 12:12:57.283064 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-54b9d76d5d-4mfvr_bc71ab7b-e861-46eb-ab9e-e45a4aafd76b/keystone-api/0.log" Nov 27 12:12:57 crc kubenswrapper[4807]: I1127 12:12:57.418671 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2744cb30-46c9-4f1e-a771-9bd30eefa50d/kube-state-metrics/0.log" Nov 27 12:12:57 crc kubenswrapper[4807]: I1127 12:12:57.506626 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rp7bm_36b0f83c-c6d3-4d4b-9675-478b3f02f952/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:12:57 crc kubenswrapper[4807]: I1127 12:12:57.849911 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-ff549ff99-zxxvk_311f3fc5-b5ab-4fd9-8146-7442b0b29409/neutron-api/0.log" Nov 27 12:12:57 crc kubenswrapper[4807]: I1127 12:12:57.874845 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-ff549ff99-zxxvk_311f3fc5-b5ab-4fd9-8146-7442b0b29409/neutron-httpd/0.log" Nov 27 12:12:58 crc kubenswrapper[4807]: I1127 12:12:58.119090 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-6z7hx_2c3d4cc1-1a11-4675-9ff2-1ee6fe346bfd/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:12:58 crc kubenswrapper[4807]: I1127 12:12:58.520692 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a071e484-2dfb-4bef-a538-69770c7f5f56/nova-api-log/0.log" Nov 27 12:12:58 crc kubenswrapper[4807]: I1127 12:12:58.562965 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_621dbc60-ba00-466f-8cbb-2e58611dff37/nova-cell0-conductor-conductor/0.log" Nov 27 12:12:58 crc kubenswrapper[4807]: I1127 12:12:58.776017 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a071e484-2dfb-4bef-a538-69770c7f5f56/nova-api-api/0.log" Nov 27 12:12:58 crc kubenswrapper[4807]: I1127 12:12:58.920999 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_2f104b79-cd6b-4d1b-9ad9-a508e5ec636b/nova-cell1-conductor-conductor/0.log" Nov 27 12:12:58 crc kubenswrapper[4807]: I1127 12:12:58.965320 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_148fe221-9289-4661-9ac6-fa5eb6af9b7f/nova-cell1-novncproxy-novncproxy/0.log" Nov 27 12:12:59 crc kubenswrapper[4807]: I1127 12:12:59.011834 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-xnswz_08c2cd76-cfdb-4de6-ac04-8925b75415fa/nova-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:12:59 crc kubenswrapper[4807]: I1127 12:12:59.232332 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_088b7a89-396a-434c-b201-a7ecb96cb2e7/nova-metadata-log/0.log" Nov 27 12:12:59 crc kubenswrapper[4807]: I1127 12:12:59.511576 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6603c2ee-9ab6-476c-8db6-d073f0dec3aa/mysql-bootstrap/0.log" Nov 27 12:12:59 crc kubenswrapper[4807]: I1127 12:12:59.668141 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6603c2ee-9ab6-476c-8db6-d073f0dec3aa/mysql-bootstrap/0.log" Nov 27 12:12:59 crc kubenswrapper[4807]: I1127 12:12:59.690234 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6603c2ee-9ab6-476c-8db6-d073f0dec3aa/galera/0.log" Nov 27 12:12:59 crc kubenswrapper[4807]: I1127 12:12:59.704936 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_39c3409e-49b1-4dfd-ba16-005e9f6e5a44/nova-scheduler-scheduler/0.log" Nov 27 12:12:59 crc kubenswrapper[4807]: I1127 12:12:59.881833 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_122d837d-ee30-4e26-9e01-1f4bd8ebaace/mysql-bootstrap/0.log" Nov 27 12:13:00 crc kubenswrapper[4807]: I1127 12:13:00.012887 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_122d837d-ee30-4e26-9e01-1f4bd8ebaace/mysql-bootstrap/0.log" Nov 27 12:13:00 crc kubenswrapper[4807]: I1127 12:13:00.111884 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_122d837d-ee30-4e26-9e01-1f4bd8ebaace/galera/0.log" Nov 27 12:13:00 crc kubenswrapper[4807]: I1127 12:13:00.198089 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_99f92409-b35d-4905-bdac-488235b8c054/openstackclient/0.log" Nov 27 12:13:00 crc kubenswrapper[4807]: I1127 12:13:00.363644 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-64nw4_356f01bb-6304-499b-946d-1e9f3d6e7572/ovn-controller/0.log" Nov 27 12:13:00 crc kubenswrapper[4807]: I1127 12:13:00.519927 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_088b7a89-396a-434c-b201-a7ecb96cb2e7/nova-metadata-metadata/0.log" Nov 27 12:13:00 crc kubenswrapper[4807]: I1127 12:13:00.735611 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6pfcf_186b6a8f-d303-440b-99ea-6502bac3e583/openstack-network-exporter/0.log" Nov 27 12:13:00 crc kubenswrapper[4807]: I1127 12:13:00.832910 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-26rzj_9bac140b-1bac-4d27-bb66-111e66af1edf/ovsdb-server-init/0.log" Nov 27 12:13:01 crc kubenswrapper[4807]: I1127 12:13:01.001863 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-26rzj_9bac140b-1bac-4d27-bb66-111e66af1edf/ovs-vswitchd/0.log" Nov 27 12:13:01 crc kubenswrapper[4807]: I1127 12:13:01.043632 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-26rzj_9bac140b-1bac-4d27-bb66-111e66af1edf/ovsdb-server-init/0.log" Nov 27 12:13:01 crc kubenswrapper[4807]: I1127 12:13:01.053522 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-26rzj_9bac140b-1bac-4d27-bb66-111e66af1edf/ovsdb-server/0.log" Nov 27 12:13:01 crc kubenswrapper[4807]: I1127 12:13:01.261436 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f2270de6-a69c-44be-8fb7-98e10027cd34/openstack-network-exporter/0.log" Nov 27 12:13:01 crc kubenswrapper[4807]: I1127 12:13:01.266956 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-b2bfj_ef17b876-0c3b-4b6e-9ef6-9afcf4cde3db/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:13:01 crc kubenswrapper[4807]: I1127 
12:13:01.362837 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f2270de6-a69c-44be-8fb7-98e10027cd34/ovn-northd/0.log" Nov 27 12:13:01 crc kubenswrapper[4807]: I1127 12:13:01.489281 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0a8b97df-a50b-4cce-8035-28b23cbdaf72/openstack-network-exporter/0.log" Nov 27 12:13:01 crc kubenswrapper[4807]: I1127 12:13:01.567405 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_0a8b97df-a50b-4cce-8035-28b23cbdaf72/ovsdbserver-nb/0.log" Nov 27 12:13:01 crc kubenswrapper[4807]: I1127 12:13:01.698136 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e47ac50c-3a93-46fd-94f2-5c83e02e1919/ovsdbserver-sb/0.log" Nov 27 12:13:01 crc kubenswrapper[4807]: I1127 12:13:01.727751 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_e47ac50c-3a93-46fd-94f2-5c83e02e1919/openstack-network-exporter/0.log" Nov 27 12:13:01 crc kubenswrapper[4807]: I1127 12:13:01.890300 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5449cc7d8-rpm6t_e4fc9fe4-54f1-458b-b2f7-ff20982e3243/placement-api/0.log" Nov 27 12:13:01 crc kubenswrapper[4807]: I1127 12:13:01.964216 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5449cc7d8-rpm6t_e4fc9fe4-54f1-458b-b2f7-ff20982e3243/placement-log/0.log" Nov 27 12:13:02 crc kubenswrapper[4807]: I1127 12:13:02.086978 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c679115a-3605-4e24-8d75-553d53d87f48/setup-container/0.log" Nov 27 12:13:02 crc kubenswrapper[4807]: I1127 12:13:02.158426 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c679115a-3605-4e24-8d75-553d53d87f48/setup-container/0.log" Nov 27 12:13:02 crc kubenswrapper[4807]: I1127 12:13:02.219418 4807 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c679115a-3605-4e24-8d75-553d53d87f48/rabbitmq/0.log" Nov 27 12:13:02 crc kubenswrapper[4807]: I1127 12:13:02.288113 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2dc733a-0951-4580-a301-d0dd7d7937f1/setup-container/0.log" Nov 27 12:13:02 crc kubenswrapper[4807]: I1127 12:13:02.537667 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2dc733a-0951-4580-a301-d0dd7d7937f1/setup-container/0.log" Nov 27 12:13:02 crc kubenswrapper[4807]: I1127 12:13:02.628500 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-j2rcb_cff69888-3585-4127-a2e6-122a7fdfe894/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:13:02 crc kubenswrapper[4807]: I1127 12:13:02.650317 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2dc733a-0951-4580-a301-d0dd7d7937f1/rabbitmq/0.log" Nov 27 12:13:02 crc kubenswrapper[4807]: I1127 12:13:02.866117 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-xwp2h_edca8731-6d7e-44e5-b2a3-8622578409df/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:13:02 crc kubenswrapper[4807]: I1127 12:13:02.888071 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-ss8zh_c530987e-af49-45dc-ae6e-13c19df75606/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:13:03 crc kubenswrapper[4807]: I1127 12:13:03.087254 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-bdw67_345077b8-ac19-43cb-8eee-e6112034320c/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:13:03 crc kubenswrapper[4807]: I1127 12:13:03.121855 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-x8578_f628332e-750f-45bd-994e-fcd01490e1e5/ssh-known-hosts-edpm-deployment/0.log" Nov 27 12:13:03 crc kubenswrapper[4807]: I1127 12:13:03.355626 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b6fc97755-xnlzr_ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257/proxy-server/0.log" Nov 27 12:13:03 crc kubenswrapper[4807]: I1127 12:13:03.424375 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-7b6fc97755-xnlzr_ae8a2ba3-f32b-447f-9b4d-7fbd18ac5257/proxy-httpd/0.log" Nov 27 12:13:03 crc kubenswrapper[4807]: I1127 12:13:03.531019 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-6hg29_b83dff2f-801e-4a9b-9427-48e1f51bcc79/swift-ring-rebalance/0.log" Nov 27 12:13:03 crc kubenswrapper[4807]: I1127 12:13:03.633971 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/account-auditor/0.log" Nov 27 12:13:03 crc kubenswrapper[4807]: I1127 12:13:03.809113 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/account-reaper/0.log" Nov 27 12:13:03 crc kubenswrapper[4807]: I1127 12:13:03.915538 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/account-replicator/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.005043 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/account-server/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.097849 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/container-auditor/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.124297 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/container-server/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.127574 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/container-replicator/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.285295 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/container-updater/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.335645 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/object-auditor/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.373318 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/object-expirer/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.403160 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/object-replicator/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.510546 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/object-updater/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.590680 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/object-server/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.641919 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/rsync/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.678388 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_bc29fb6b-2886-4d51-8afd-be8fc1069ee4/swift-recon-cron/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.845186 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-w4ggh_8a22c7d6-438a-499e-80d0-384ea7d2ec15/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:13:04 crc kubenswrapper[4807]: I1127 12:13:04.876478 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_203f3a06-5cde-4778-837a-90fbfde39772/tempest-tests-tempest-tests-runner/0.log" Nov 27 12:13:05 crc kubenswrapper[4807]: I1127 12:13:05.027419 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e67a9ba6-daee-4e19-bf83-d51152329c5c/test-operator-logs-container/0.log" Nov 27 12:13:05 crc kubenswrapper[4807]: I1127 12:13:05.063805 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-lt95g_e971f91a-7313-4149-af78-554da58f81e1/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Nov 27 12:13:13 crc kubenswrapper[4807]: I1127 12:13:13.510093 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_8ecce491-4a06-4922-8353-0586ac99471b/memcached/0.log" Nov 27 12:13:20 crc kubenswrapper[4807]: I1127 12:13:20.921911 4807 patch_prober.go:28] interesting pod/machine-config-daemon-kk425 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 27 12:13:20 crc kubenswrapper[4807]: I1127 12:13:20.922495 4807 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 27 12:13:20 crc kubenswrapper[4807]: I1127 12:13:20.922558 4807 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kk425" Nov 27 12:13:20 crc kubenswrapper[4807]: I1127 12:13:20.923342 4807 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91"} pod="openshift-machine-config-operator/machine-config-daemon-kk425" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 27 12:13:20 crc kubenswrapper[4807]: I1127 12:13:20.923391 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerName="machine-config-daemon" containerID="cri-o://e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" gracePeriod=600 Nov 27 12:13:21 crc kubenswrapper[4807]: E1127 12:13:21.056964 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:13:21 crc kubenswrapper[4807]: I1127 12:13:21.418873 4807 generic.go:334] "Generic (PLEG): container finished" podID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" exitCode=0 Nov 27 12:13:21 crc kubenswrapper[4807]: I1127 
12:13:21.418955 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerDied","Data":"e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91"} Nov 27 12:13:21 crc kubenswrapper[4807]: I1127 12:13:21.420000 4807 scope.go:117] "RemoveContainer" containerID="e6ad55be444bb836094e989710fe293e7e4b3c2a590dc0117eb67fcd7f9ef509" Nov 27 12:13:21 crc kubenswrapper[4807]: I1127 12:13:21.422427 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:13:21 crc kubenswrapper[4807]: E1127 12:13:21.422915 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:13:27 crc kubenswrapper[4807]: I1127 12:13:27.905077 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-z6w5s_7377040f-fbf5-4395-a903-99dbb10dbcac/kube-rbac-proxy/0.log" Nov 27 12:13:27 crc kubenswrapper[4807]: I1127 12:13:27.960180 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b64f4fb85-z6w5s_7377040f-fbf5-4395-a903-99dbb10dbcac/manager/0.log" Nov 27 12:13:28 crc kubenswrapper[4807]: I1127 12:13:28.081020 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-6sqtw_623644bf-2d87-4689-acea-cfaeca90285f/kube-rbac-proxy/0.log" Nov 27 12:13:28 crc kubenswrapper[4807]: I1127 12:13:28.140270 4807 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6b7f75547b-6sqtw_623644bf-2d87-4689-acea-cfaeca90285f/manager/0.log" Nov 27 12:13:28 crc kubenswrapper[4807]: I1127 12:13:28.251661 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/util/0.log" Nov 27 12:13:28 crc kubenswrapper[4807]: I1127 12:13:28.372143 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/util/0.log" Nov 27 12:13:28 crc kubenswrapper[4807]: I1127 12:13:28.425209 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/pull/0.log" Nov 27 12:13:28 crc kubenswrapper[4807]: I1127 12:13:28.426612 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/pull/0.log" Nov 27 12:13:28 crc kubenswrapper[4807]: I1127 12:13:28.568902 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/util/0.log" Nov 27 12:13:28 crc kubenswrapper[4807]: I1127 12:13:28.574381 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/pull/0.log" Nov 27 12:13:28 crc kubenswrapper[4807]: I1127 12:13:28.612094 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_d9f8f83c8c15c9e92d91ce663afd0a0b64ef58ec4c607c010f0e58ab265xt4h_3ae63091-3a7b-4708-82c7-d59383b22b9b/extract/0.log" Nov 
27 12:13:28 crc kubenswrapper[4807]: I1127 12:13:28.745949 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-5wfgl_574b2edd-5058-4d84-a8b8-72258c3c9f7b/kube-rbac-proxy/0.log" Nov 27 12:13:28 crc kubenswrapper[4807]: I1127 12:13:28.781924 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-955677c94-5wfgl_574b2edd-5058-4d84-a8b8-72258c3c9f7b/manager/0.log" Nov 27 12:13:28 crc kubenswrapper[4807]: I1127 12:13:28.818494 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-v9d6j_5ae030e1-b973-4137-abd1-1abc5f5d1153/kube-rbac-proxy/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.029211 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-589cbd6b5b-v9d6j_5ae030e1-b973-4137-abd1-1abc5f5d1153/manager/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.049979 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-j2tq6_c5b9cfda-ea17-4add-a121-036a989efeab/manager/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.054815 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5b77f656f-j2tq6_c5b9cfda-ea17-4add-a121-036a989efeab/kube-rbac-proxy/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.192208 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-fztl6_4ae17b3e-8de9-45e3-8404-2f2fda6c6b99/kube-rbac-proxy/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.241519 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d494799bf-fztl6_4ae17b3e-8de9-45e3-8404-2f2fda6c6b99/manager/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.393219 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-qg4bq_4be44e13-06b8-494e-8a62-7e8d8747692f/kube-rbac-proxy/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.447644 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-xtdbs_9262ad56-c1b8-41ee-ab6b-1b3c57dbdb5b/kube-rbac-proxy/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.509747 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-qg4bq_4be44e13-06b8-494e-8a62-7e8d8747692f/manager/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.568336 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-67cb4dc6d4-xtdbs_9262ad56-c1b8-41ee-ab6b-1b3c57dbdb5b/manager/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.656623 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-lz6lg_af2f67ab-040b-4ec1-bf21-db83dcaeb6d2/kube-rbac-proxy/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.734395 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7b4567c7cf-lz6lg_af2f67ab-040b-4ec1-bf21-db83dcaeb6d2/manager/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.832401 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-hcnzc_3ae6d3a5-8999-4c3d-a3de-b497ae0776f2/kube-rbac-proxy/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.857108 
4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5d499bf58b-hcnzc_3ae6d3a5-8999-4c3d-a3de-b497ae0776f2/manager/0.log" Nov 27 12:13:29 crc kubenswrapper[4807]: I1127 12:13:29.994191 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-5wb7l_fe4ff55b-a2dd-4936-9016-d73ade2388a0/kube-rbac-proxy/0.log" Nov 27 12:13:30 crc kubenswrapper[4807]: I1127 12:13:30.054920 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-66f4dd4bc7-5wb7l_fe4ff55b-a2dd-4936-9016-d73ade2388a0/manager/0.log" Nov 27 12:13:30 crc kubenswrapper[4807]: I1127 12:13:30.129398 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-xfrls_961001c9-3719-4306-8d38-b3c5d8e202bc/kube-rbac-proxy/0.log" Nov 27 12:13:30 crc kubenswrapper[4807]: I1127 12:13:30.245820 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6fdcddb789-xfrls_961001c9-3719-4306-8d38-b3c5d8e202bc/manager/0.log" Nov 27 12:13:30 crc kubenswrapper[4807]: I1127 12:13:30.285729 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-6l4tm_dcfd531a-2394-41c7-b05a-5b8e95f8459c/kube-rbac-proxy/0.log" Nov 27 12:13:30 crc kubenswrapper[4807]: I1127 12:13:30.391166 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-79556f57fc-6l4tm_dcfd531a-2394-41c7-b05a-5b8e95f8459c/manager/0.log" Nov 27 12:13:30 crc kubenswrapper[4807]: I1127 12:13:30.442064 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-k82xf_1b313bc2-c896-486c-a520-9843ec7bd6ad/kube-rbac-proxy/0.log" Nov 27 12:13:30 crc 
kubenswrapper[4807]: I1127 12:13:30.529245 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-64cdc6ff96-k82xf_1b313bc2-c896-486c-a520-9843ec7bd6ad/manager/0.log" Nov 27 12:13:30 crc kubenswrapper[4807]: I1127 12:13:30.648626 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2_0bde1253-53c0-4864-b22e-dcf25751388e/kube-rbac-proxy/0.log" Nov 27 12:13:30 crc kubenswrapper[4807]: I1127 12:13:30.680860 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5fcdb54b6bl2xj2_0bde1253-53c0-4864-b22e-dcf25751388e/manager/0.log" Nov 27 12:13:31 crc kubenswrapper[4807]: I1127 12:13:31.059390 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-z2sjb_c9606905-c5fd-4ebe-9942-b013364d7ca8/registry-server/0.log" Nov 27 12:13:31 crc kubenswrapper[4807]: I1127 12:13:31.081683 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-59f78dbdf9-fzjdb_ffe745e4-da98-4391-990b-a86d2fbc3346/operator/0.log" Nov 27 12:13:31 crc kubenswrapper[4807]: I1127 12:13:31.184992 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-d9mj8_787c342b-413f-495a-8b31-bd8a01f35c3a/kube-rbac-proxy/0.log" Nov 27 12:13:31 crc kubenswrapper[4807]: I1127 12:13:31.321984 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-56897c768d-d9mj8_787c342b-413f-495a-8b31-bd8a01f35c3a/manager/0.log" Nov 27 12:13:31 crc kubenswrapper[4807]: I1127 12:13:31.512549 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-cw9bt_a0e8d2a3-0f58-4a1d-9867-648001196d2e/kube-rbac-proxy/0.log" Nov 27 12:13:31 crc kubenswrapper[4807]: I1127 12:13:31.515840 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57988cc5b5-cw9bt_a0e8d2a3-0f58-4a1d-9867-648001196d2e/manager/0.log" Nov 27 12:13:31 crc kubenswrapper[4807]: I1127 12:13:31.646118 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-l72vc_8dd596e5-e21b-4cae-bb6c-c7c2b1d09c91/operator/0.log" Nov 27 12:13:31 crc kubenswrapper[4807]: I1127 12:13:31.718391 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-pd7pj_94062b2f-3f5a-404d-9b0a-8b7f858e1322/kube-rbac-proxy/0.log" Nov 27 12:13:31 crc kubenswrapper[4807]: I1127 12:13:31.830491 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d77b94747-pd7pj_94062b2f-3f5a-404d-9b0a-8b7f858e1322/manager/0.log" Nov 27 12:13:31 crc kubenswrapper[4807]: I1127 12:13:31.911831 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6456fcdb48-tjnrt_649aedb9-ad77-47fa-a7e9-89cb12c65928/manager/0.log" Nov 27 12:13:31 crc kubenswrapper[4807]: I1127 12:13:31.917805 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-svq2j_5b554316-8e33-4fa8-a340-91d9e0f6b0de/kube-rbac-proxy/0.log" Nov 27 12:13:32 crc kubenswrapper[4807]: I1127 12:13:32.066363 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-mw4mw_7fbca001-90e9-4da2-bd14-6bc00a48ed40/kube-rbac-proxy/0.log" Nov 27 12:13:32 crc kubenswrapper[4807]: I1127 
12:13:32.071019 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-svq2j_5b554316-8e33-4fa8-a340-91d9e0f6b0de/manager/0.log" Nov 27 12:13:32 crc kubenswrapper[4807]: I1127 12:13:32.092439 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5cd6c7f4c8-mw4mw_7fbca001-90e9-4da2-bd14-6bc00a48ed40/manager/0.log" Nov 27 12:13:32 crc kubenswrapper[4807]: I1127 12:13:32.213061 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-tj4db_13749acc-f727-4c3a-b24a-b56bd6b7533d/kube-rbac-proxy/0.log" Nov 27 12:13:32 crc kubenswrapper[4807]: I1127 12:13:32.238582 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-656dcb59d4-tj4db_13749acc-f727-4c3a-b24a-b56bd6b7533d/manager/0.log" Nov 27 12:13:34 crc kubenswrapper[4807]: I1127 12:13:34.533061 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:13:34 crc kubenswrapper[4807]: E1127 12:13:34.533950 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:13:49 crc kubenswrapper[4807]: I1127 12:13:49.540969 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:13:49 crc kubenswrapper[4807]: E1127 12:13:49.541742 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:13:50 crc kubenswrapper[4807]: I1127 12:13:50.446732 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xrg5s_8a96dd38-5283-4cea-a3d4-623c6a5191a6/control-plane-machine-set-operator/0.log" Nov 27 12:13:50 crc kubenswrapper[4807]: I1127 12:13:50.647164 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pvv9r_d0fee666-2d95-4330-a8aa-4ab1ca30bb5f/machine-api-operator/0.log" Nov 27 12:13:50 crc kubenswrapper[4807]: I1127 12:13:50.653830 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-pvv9r_d0fee666-2d95-4330-a8aa-4ab1ca30bb5f/kube-rbac-proxy/0.log" Nov 27 12:14:01 crc kubenswrapper[4807]: I1127 12:14:01.532159 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:14:01 crc kubenswrapper[4807]: E1127 12:14:01.532817 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:14:02 crc kubenswrapper[4807]: I1127 12:14:02.668128 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-czgf5_bc3f48f0-2d12-4c07-bfb4-20914aeaf910/cert-manager-controller/0.log" Nov 27 12:14:02 crc kubenswrapper[4807]: I1127 12:14:02.872422 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-cbsmh_8285736f-5d32-4503-94dd-f3e7c5d6a8f0/cert-manager-cainjector/0.log" Nov 27 12:14:03 crc kubenswrapper[4807]: I1127 12:14:03.048442 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-nwhxv_622a1ad1-bedf-4836-aa6e-0257f4694ae9/cert-manager-webhook/0.log" Nov 27 12:14:14 crc kubenswrapper[4807]: I1127 12:14:14.532372 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:14:14 crc kubenswrapper[4807]: E1127 12:14:14.533266 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:14:15 crc kubenswrapper[4807]: I1127 12:14:15.945175 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-rvf44_902faa80-4625-495f-a0a9-b94bc50eae67/nmstate-console-plugin/0.log" Nov 27 12:14:16 crc kubenswrapper[4807]: I1127 12:14:16.158804 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-sm6qf_8ab1549f-5eb1-4dee-be4b-c3ed2ce50f60/nmstate-handler/0.log" Nov 27 12:14:16 crc kubenswrapper[4807]: I1127 12:14:16.221400 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-rcrm6_18345e30-1bdc-47bf-8e02-16cf6c3f1bb1/nmstate-metrics/0.log" Nov 27 12:14:16 crc kubenswrapper[4807]: I1127 12:14:16.237875 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-rcrm6_18345e30-1bdc-47bf-8e02-16cf6c3f1bb1/kube-rbac-proxy/0.log" Nov 27 12:14:16 crc kubenswrapper[4807]: I1127 12:14:16.474432 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-6cwxp_f99de4c6-acb8-40aa-8c9f-c450de947993/nmstate-operator/0.log" Nov 27 12:14:16 crc kubenswrapper[4807]: I1127 12:14:16.537179 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-w7q6m_57fe27be-6097-4ef2-ac4a-2ff1625005a9/nmstate-webhook/0.log" Nov 27 12:14:29 crc kubenswrapper[4807]: I1127 12:14:29.549704 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:14:29 crc kubenswrapper[4807]: E1127 12:14:29.550574 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:14:31 crc kubenswrapper[4807]: I1127 12:14:31.307717 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-pkvz6_cfa209ab-1103-43a9-88e3-b7dd7048b2a6/kube-rbac-proxy/0.log" Nov 27 12:14:31 crc kubenswrapper[4807]: I1127 12:14:31.373461 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-pkvz6_cfa209ab-1103-43a9-88e3-b7dd7048b2a6/controller/0.log" Nov 27 12:14:31 crc 
kubenswrapper[4807]: I1127 12:14:31.500705 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-frr-files/0.log" Nov 27 12:14:31 crc kubenswrapper[4807]: I1127 12:14:31.698111 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-metrics/0.log" Nov 27 12:14:31 crc kubenswrapper[4807]: I1127 12:14:31.707915 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-reloader/0.log" Nov 27 12:14:31 crc kubenswrapper[4807]: I1127 12:14:31.724144 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-frr-files/0.log" Nov 27 12:14:31 crc kubenswrapper[4807]: I1127 12:14:31.744843 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-reloader/0.log" Nov 27 12:14:31 crc kubenswrapper[4807]: I1127 12:14:31.895364 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-frr-files/0.log" Nov 27 12:14:31 crc kubenswrapper[4807]: I1127 12:14:31.912196 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-reloader/0.log" Nov 27 12:14:31 crc kubenswrapper[4807]: I1127 12:14:31.926991 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-metrics/0.log" Nov 27 12:14:31 crc kubenswrapper[4807]: I1127 12:14:31.978571 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-metrics/0.log" Nov 27 12:14:32 crc kubenswrapper[4807]: I1127 12:14:32.148625 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-reloader/0.log" Nov 27 12:14:32 crc kubenswrapper[4807]: I1127 12:14:32.180346 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/controller/0.log" Nov 27 12:14:32 crc kubenswrapper[4807]: I1127 12:14:32.202021 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-frr-files/0.log" Nov 27 12:14:32 crc kubenswrapper[4807]: I1127 12:14:32.219619 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/cp-metrics/0.log" Nov 27 12:14:32 crc kubenswrapper[4807]: I1127 12:14:32.375980 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/kube-rbac-proxy/0.log" Nov 27 12:14:32 crc kubenswrapper[4807]: I1127 12:14:32.405465 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/frr-metrics/0.log" Nov 27 12:14:32 crc kubenswrapper[4807]: I1127 12:14:32.438827 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/kube-rbac-proxy-frr/0.log" Nov 27 12:14:32 crc kubenswrapper[4807]: I1127 12:14:32.577561 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/reloader/0.log" Nov 27 12:14:32 crc kubenswrapper[4807]: I1127 12:14:32.647856 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-xksm8_43f0dfbf-ad37-403d-968c-852dec2e09a0/frr-k8s-webhook-server/0.log" Nov 27 12:14:32 crc kubenswrapper[4807]: I1127 12:14:32.878919 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7dd7bd5d6c-lvbpn_fd79b824-f426-4793-b5e4-b351642047f5/manager/0.log" Nov 27 12:14:33 crc kubenswrapper[4807]: I1127 12:14:33.062957 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-64bbfd4bf8-wlg8q_dfbac2a4-3c47-4ac8-8643-c322886121d4/webhook-server/0.log" Nov 27 12:14:33 crc kubenswrapper[4807]: I1127 12:14:33.078313 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-chfn8_6bdffc3f-68a4-4eb0-a4a7-725db327ea08/kube-rbac-proxy/0.log" Nov 27 12:14:33 crc kubenswrapper[4807]: I1127 12:14:33.633735 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5rlln_0ddec633-d788-4b9f-afe6-c059e3c7f2e5/frr/0.log" Nov 27 12:14:33 crc kubenswrapper[4807]: I1127 12:14:33.700721 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-chfn8_6bdffc3f-68a4-4eb0-a4a7-725db327ea08/speaker/0.log" Nov 27 12:14:40 crc kubenswrapper[4807]: I1127 12:14:40.532371 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:14:40 crc kubenswrapper[4807]: E1127 12:14:40.533050 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:14:47 crc kubenswrapper[4807]: I1127 12:14:47.213304 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/util/0.log" Nov 27 12:14:47 crc kubenswrapper[4807]: I1127 
12:14:47.398032 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/pull/0.log" Nov 27 12:14:47 crc kubenswrapper[4807]: I1127 12:14:47.442691 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/pull/0.log" Nov 27 12:14:47 crc kubenswrapper[4807]: I1127 12:14:47.446376 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/util/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 12:14:48.133013 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/util/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 12:14:48.163729 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/pull/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 12:14:48.222632 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212ftstwr_11676299-32d9-41ed-92c6-7e3d55378519/extract/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 12:14:48.323030 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/util/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 12:14:48.439487 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/util/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 12:14:48.488464 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/pull/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 12:14:48.506807 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/pull/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 12:14:48.622149 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/pull/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 12:14:48.658983 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/util/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 12:14:48.714042 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83dlcgw_f9cf2b1d-920e-4c71-8180-3a944a6b745a/extract/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 12:14:48.769442 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/extract-utilities/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 12:14:48.956757 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/extract-content/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 
12:14:48.962516 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/extract-content/0.log" Nov 27 12:14:48 crc kubenswrapper[4807]: I1127 12:14:48.994868 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/extract-utilities/0.log" Nov 27 12:14:49 crc kubenswrapper[4807]: I1127 12:14:49.128492 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/extract-content/0.log" Nov 27 12:14:49 crc kubenswrapper[4807]: I1127 12:14:49.169612 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/extract-utilities/0.log" Nov 27 12:14:49 crc kubenswrapper[4807]: I1127 12:14:49.322935 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/extract-utilities/0.log" Nov 27 12:14:49 crc kubenswrapper[4807]: I1127 12:14:49.580466 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/extract-content/0.log" Nov 27 12:14:49 crc kubenswrapper[4807]: I1127 12:14:49.595544 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hs4rb_fc334283-2738-4bcb-9ad6-b0654ceb5032/registry-server/0.log" Nov 27 12:14:49 crc kubenswrapper[4807]: I1127 12:14:49.612216 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/extract-content/0.log" Nov 27 12:14:49 crc kubenswrapper[4807]: I1127 12:14:49.656870 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/extract-utilities/0.log" Nov 27 12:14:49 crc kubenswrapper[4807]: I1127 12:14:49.837780 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/extract-utilities/0.log" Nov 27 12:14:49 crc kubenswrapper[4807]: I1127 12:14:49.879259 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/extract-content/0.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.052848 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q6pz5_c5121ac2-4e63-4d46-b899-89bbfbb19550/marketplace-operator/3.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.146703 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-q6pz5_c5121ac2-4e63-4d46-b899-89bbfbb19550/marketplace-operator/2.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.263845 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/extract-utilities/0.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.305910 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-wv2x5_c5e11b2e-3ee9-4fb1-a1e6-de7d28ca79e4/registry-server/0.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.443145 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/extract-content/0.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.445273 4807 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/extract-utilities/0.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.447155 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/extract-content/0.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.619706 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/extract-utilities/0.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.625095 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/extract-content/0.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.629728 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/extract-utilities/0.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.754309 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4bvrf_2b1d58e4-b0b5-4213-a286-5237808fe138/registry-server/0.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.806451 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/extract-utilities/0.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.836299 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/extract-content/0.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.855482 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/extract-content/0.log" 
Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.989663 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/extract-content/0.log" Nov 27 12:14:50 crc kubenswrapper[4807]: I1127 12:14:50.991006 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/extract-utilities/0.log" Nov 27 12:14:51 crc kubenswrapper[4807]: I1127 12:14:51.545914 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hmsvr_e6063b43-7eea-4c80-aa97-a65aa6790390/registry-server/0.log" Nov 27 12:14:52 crc kubenswrapper[4807]: I1127 12:14:52.533337 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:14:52 crc kubenswrapper[4807]: E1127 12:14:52.533863 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.181893 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x"] Nov 27 12:15:00 crc kubenswrapper[4807]: E1127 12:15:00.183032 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2f45ee-1a23-4b70-adcb-761de0316b9d" containerName="container-00" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.183056 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2f45ee-1a23-4b70-adcb-761de0316b9d" containerName="container-00" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 
12:15:00.183373 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d2f45ee-1a23-4b70-adcb-761de0316b9d" containerName="container-00" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.184153 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.186703 4807 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.186722 4807 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.199516 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x"] Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.248775 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtccz\" (UniqueName: \"kubernetes.io/projected/87de7d87-edff-4b46-be23-cc1807a3bd0b-kube-api-access-rtccz\") pod \"collect-profiles-29404095-2b28x\" (UID: \"87de7d87-edff-4b46-be23-cc1807a3bd0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.248896 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87de7d87-edff-4b46-be23-cc1807a3bd0b-config-volume\") pod \"collect-profiles-29404095-2b28x\" (UID: \"87de7d87-edff-4b46-be23-cc1807a3bd0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.248968 4807 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87de7d87-edff-4b46-be23-cc1807a3bd0b-secret-volume\") pod \"collect-profiles-29404095-2b28x\" (UID: \"87de7d87-edff-4b46-be23-cc1807a3bd0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.350035 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87de7d87-edff-4b46-be23-cc1807a3bd0b-secret-volume\") pod \"collect-profiles-29404095-2b28x\" (UID: \"87de7d87-edff-4b46-be23-cc1807a3bd0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.350170 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtccz\" (UniqueName: \"kubernetes.io/projected/87de7d87-edff-4b46-be23-cc1807a3bd0b-kube-api-access-rtccz\") pod \"collect-profiles-29404095-2b28x\" (UID: \"87de7d87-edff-4b46-be23-cc1807a3bd0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.350228 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87de7d87-edff-4b46-be23-cc1807a3bd0b-config-volume\") pod \"collect-profiles-29404095-2b28x\" (UID: \"87de7d87-edff-4b46-be23-cc1807a3bd0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.351038 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87de7d87-edff-4b46-be23-cc1807a3bd0b-config-volume\") pod \"collect-profiles-29404095-2b28x\" (UID: \"87de7d87-edff-4b46-be23-cc1807a3bd0b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.364498 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87de7d87-edff-4b46-be23-cc1807a3bd0b-secret-volume\") pod \"collect-profiles-29404095-2b28x\" (UID: \"87de7d87-edff-4b46-be23-cc1807a3bd0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.368363 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtccz\" (UniqueName: \"kubernetes.io/projected/87de7d87-edff-4b46-be23-cc1807a3bd0b-kube-api-access-rtccz\") pod \"collect-profiles-29404095-2b28x\" (UID: \"87de7d87-edff-4b46-be23-cc1807a3bd0b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.502801 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" Nov 27 12:15:00 crc kubenswrapper[4807]: I1127 12:15:00.961545 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x"] Nov 27 12:15:01 crc kubenswrapper[4807]: I1127 12:15:01.402290 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" event={"ID":"87de7d87-edff-4b46-be23-cc1807a3bd0b","Type":"ContainerStarted","Data":"6bc4c5d9a888bcde5c8c7c574772653c4a63ba1bb8255744204ed06696f66125"} Nov 27 12:15:01 crc kubenswrapper[4807]: I1127 12:15:01.402580 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" event={"ID":"87de7d87-edff-4b46-be23-cc1807a3bd0b","Type":"ContainerStarted","Data":"b8bf8b7cb2c9d347bb8d1d2f65a1dd879e42407f791468de0d9d828166c847e8"} Nov 27 12:15:01 crc kubenswrapper[4807]: I1127 12:15:01.423497 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" podStartSLOduration=1.423473641 podStartE2EDuration="1.423473641s" podCreationTimestamp="2025-11-27 12:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-27 12:15:01.421505058 +0000 UTC m=+3942.521003246" watchObservedRunningTime="2025-11-27 12:15:01.423473641 +0000 UTC m=+3942.522971839" Nov 27 12:15:02 crc kubenswrapper[4807]: I1127 12:15:02.411761 4807 generic.go:334] "Generic (PLEG): container finished" podID="87de7d87-edff-4b46-be23-cc1807a3bd0b" containerID="6bc4c5d9a888bcde5c8c7c574772653c4a63ba1bb8255744204ed06696f66125" exitCode=0 Nov 27 12:15:02 crc kubenswrapper[4807]: I1127 12:15:02.411803 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" event={"ID":"87de7d87-edff-4b46-be23-cc1807a3bd0b","Type":"ContainerDied","Data":"6bc4c5d9a888bcde5c8c7c574772653c4a63ba1bb8255744204ed06696f66125"} Nov 27 12:15:03 crc kubenswrapper[4807]: I1127 12:15:03.533023 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:15:03 crc kubenswrapper[4807]: E1127 12:15:03.533580 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.330610 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.359359 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87de7d87-edff-4b46-be23-cc1807a3bd0b-secret-volume\") pod \"87de7d87-edff-4b46-be23-cc1807a3bd0b\" (UID: \"87de7d87-edff-4b46-be23-cc1807a3bd0b\") " Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.359581 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtccz\" (UniqueName: \"kubernetes.io/projected/87de7d87-edff-4b46-be23-cc1807a3bd0b-kube-api-access-rtccz\") pod \"87de7d87-edff-4b46-be23-cc1807a3bd0b\" (UID: \"87de7d87-edff-4b46-be23-cc1807a3bd0b\") " Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.359621 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87de7d87-edff-4b46-be23-cc1807a3bd0b-config-volume\") pod \"87de7d87-edff-4b46-be23-cc1807a3bd0b\" (UID: \"87de7d87-edff-4b46-be23-cc1807a3bd0b\") " Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.364984 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87de7d87-edff-4b46-be23-cc1807a3bd0b-config-volume" (OuterVolumeSpecName: "config-volume") pod "87de7d87-edff-4b46-be23-cc1807a3bd0b" (UID: "87de7d87-edff-4b46-be23-cc1807a3bd0b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.377134 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87de7d87-edff-4b46-be23-cc1807a3bd0b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "87de7d87-edff-4b46-be23-cc1807a3bd0b" (UID: "87de7d87-edff-4b46-be23-cc1807a3bd0b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.383669 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87de7d87-edff-4b46-be23-cc1807a3bd0b-kube-api-access-rtccz" (OuterVolumeSpecName: "kube-api-access-rtccz") pod "87de7d87-edff-4b46-be23-cc1807a3bd0b" (UID: "87de7d87-edff-4b46-be23-cc1807a3bd0b"). InnerVolumeSpecName "kube-api-access-rtccz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.435064 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" event={"ID":"87de7d87-edff-4b46-be23-cc1807a3bd0b","Type":"ContainerDied","Data":"b8bf8b7cb2c9d347bb8d1d2f65a1dd879e42407f791468de0d9d828166c847e8"} Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.435108 4807 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8bf8b7cb2c9d347bb8d1d2f65a1dd879e42407f791468de0d9d828166c847e8" Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.435138 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29404095-2b28x" Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.462561 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtccz\" (UniqueName: \"kubernetes.io/projected/87de7d87-edff-4b46-be23-cc1807a3bd0b-kube-api-access-rtccz\") on node \"crc\" DevicePath \"\"" Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.462599 4807 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87de7d87-edff-4b46-be23-cc1807a3bd0b-config-volume\") on node \"crc\" DevicePath \"\"" Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.462609 4807 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87de7d87-edff-4b46-be23-cc1807a3bd0b-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.481939 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7"] Nov 27 12:15:04 crc kubenswrapper[4807]: I1127 12:15:04.490654 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29404050-9llx7"] Nov 27 12:15:05 crc kubenswrapper[4807]: I1127 12:15:05.544701 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4550963-890e-4525-901a-e9b520618ff8" path="/var/lib/kubelet/pods/b4550963-890e-4525-901a-e9b520618ff8/volumes" Nov 27 12:15:16 crc kubenswrapper[4807]: I1127 12:15:16.532268 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:15:16 crc kubenswrapper[4807]: E1127 12:15:16.533090 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:15:26 crc kubenswrapper[4807]: I1127 12:15:26.800512 4807 scope.go:117] "RemoveContainer" containerID="d365fdc122d406c370af34c79f0b84f63526ae27c969d680055a446b6a3b4a56" Nov 27 12:15:30 crc kubenswrapper[4807]: I1127 12:15:30.532500 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:15:30 crc kubenswrapper[4807]: E1127 12:15:30.533222 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.588316 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hqqdp"] Nov 27 12:15:42 crc kubenswrapper[4807]: E1127 12:15:42.591023 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87de7d87-edff-4b46-be23-cc1807a3bd0b" containerName="collect-profiles" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.591586 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="87de7d87-edff-4b46-be23-cc1807a3bd0b" containerName="collect-profiles" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.591928 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="87de7d87-edff-4b46-be23-cc1807a3bd0b" containerName="collect-profiles" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.595603 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.618307 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3c67f2-3a77-4bff-9430-722c98d31f95-catalog-content\") pod \"redhat-operators-hqqdp\" (UID: \"4b3c67f2-3a77-4bff-9430-722c98d31f95\") " pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.618755 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3c67f2-3a77-4bff-9430-722c98d31f95-utilities\") pod \"redhat-operators-hqqdp\" (UID: \"4b3c67f2-3a77-4bff-9430-722c98d31f95\") " pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.618822 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbm5m\" (UniqueName: \"kubernetes.io/projected/4b3c67f2-3a77-4bff-9430-722c98d31f95-kube-api-access-sbm5m\") pod \"redhat-operators-hqqdp\" (UID: \"4b3c67f2-3a77-4bff-9430-722c98d31f95\") " pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.623265 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqqdp"] Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.720907 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3c67f2-3a77-4bff-9430-722c98d31f95-utilities\") pod \"redhat-operators-hqqdp\" (UID: \"4b3c67f2-3a77-4bff-9430-722c98d31f95\") " pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.721372 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sbm5m\" (UniqueName: \"kubernetes.io/projected/4b3c67f2-3a77-4bff-9430-722c98d31f95-kube-api-access-sbm5m\") pod \"redhat-operators-hqqdp\" (UID: \"4b3c67f2-3a77-4bff-9430-722c98d31f95\") " pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.721528 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3c67f2-3a77-4bff-9430-722c98d31f95-utilities\") pod \"redhat-operators-hqqdp\" (UID: \"4b3c67f2-3a77-4bff-9430-722c98d31f95\") " pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.721723 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3c67f2-3a77-4bff-9430-722c98d31f95-catalog-content\") pod \"redhat-operators-hqqdp\" (UID: \"4b3c67f2-3a77-4bff-9430-722c98d31f95\") " pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.722038 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3c67f2-3a77-4bff-9430-722c98d31f95-catalog-content\") pod \"redhat-operators-hqqdp\" (UID: \"4b3c67f2-3a77-4bff-9430-722c98d31f95\") " pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.776174 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbm5m\" (UniqueName: \"kubernetes.io/projected/4b3c67f2-3a77-4bff-9430-722c98d31f95-kube-api-access-sbm5m\") pod \"redhat-operators-hqqdp\" (UID: \"4b3c67f2-3a77-4bff-9430-722c98d31f95\") " pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:42 crc kubenswrapper[4807]: I1127 12:15:42.923285 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:43 crc kubenswrapper[4807]: I1127 12:15:43.369027 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hqqdp"] Nov 27 12:15:43 crc kubenswrapper[4807]: I1127 12:15:43.869351 4807 generic.go:334] "Generic (PLEG): container finished" podID="4b3c67f2-3a77-4bff-9430-722c98d31f95" containerID="f89f6c5d6d48c6a18afc79e62c73b32f0913b041eef230fca1f01f0c7d8bb8f7" exitCode=0 Nov 27 12:15:43 crc kubenswrapper[4807]: I1127 12:15:43.869613 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqdp" event={"ID":"4b3c67f2-3a77-4bff-9430-722c98d31f95","Type":"ContainerDied","Data":"f89f6c5d6d48c6a18afc79e62c73b32f0913b041eef230fca1f01f0c7d8bb8f7"} Nov 27 12:15:43 crc kubenswrapper[4807]: I1127 12:15:43.869639 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqdp" event={"ID":"4b3c67f2-3a77-4bff-9430-722c98d31f95","Type":"ContainerStarted","Data":"46cbc6bd2eec42d2ee894c0e4327ca7f1afe8024e202a6caf3eab895b1ef77c4"} Nov 27 12:15:43 crc kubenswrapper[4807]: I1127 12:15:43.871737 4807 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 27 12:15:44 crc kubenswrapper[4807]: I1127 12:15:44.533793 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:15:44 crc kubenswrapper[4807]: E1127 12:15:44.534327 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:15:44 crc 
kubenswrapper[4807]: I1127 12:15:44.893393 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqdp" event={"ID":"4b3c67f2-3a77-4bff-9430-722c98d31f95","Type":"ContainerStarted","Data":"f7c5e27fffc57b59044bb73710961bbdb53f20bc6da1f3be8e84f77c44d7b356"} Nov 27 12:15:45 crc kubenswrapper[4807]: I1127 12:15:45.904101 4807 generic.go:334] "Generic (PLEG): container finished" podID="4b3c67f2-3a77-4bff-9430-722c98d31f95" containerID="f7c5e27fffc57b59044bb73710961bbdb53f20bc6da1f3be8e84f77c44d7b356" exitCode=0 Nov 27 12:15:45 crc kubenswrapper[4807]: I1127 12:15:45.904484 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqdp" event={"ID":"4b3c67f2-3a77-4bff-9430-722c98d31f95","Type":"ContainerDied","Data":"f7c5e27fffc57b59044bb73710961bbdb53f20bc6da1f3be8e84f77c44d7b356"} Nov 27 12:15:47 crc kubenswrapper[4807]: I1127 12:15:47.927573 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqdp" event={"ID":"4b3c67f2-3a77-4bff-9430-722c98d31f95","Type":"ContainerStarted","Data":"3016ee4bfa980f81a0e899623b1afdc97fefb9c2795bc9295768370b82d7a0fa"} Nov 27 12:15:47 crc kubenswrapper[4807]: I1127 12:15:47.956187 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hqqdp" podStartSLOduration=3.318932793 podStartE2EDuration="5.956163888s" podCreationTimestamp="2025-11-27 12:15:42 +0000 UTC" firstStartedPulling="2025-11-27 12:15:43.871464587 +0000 UTC m=+3984.970962775" lastFinishedPulling="2025-11-27 12:15:46.508695672 +0000 UTC m=+3987.608193870" observedRunningTime="2025-11-27 12:15:47.947313382 +0000 UTC m=+3989.046811570" watchObservedRunningTime="2025-11-27 12:15:47.956163888 +0000 UTC m=+3989.055662116" Nov 27 12:15:52 crc kubenswrapper[4807]: I1127 12:15:52.923739 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:52 crc kubenswrapper[4807]: I1127 12:15:52.924234 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:52 crc kubenswrapper[4807]: I1127 12:15:52.980663 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:53 crc kubenswrapper[4807]: I1127 12:15:53.028641 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:53 crc kubenswrapper[4807]: I1127 12:15:53.216270 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hqqdp"] Nov 27 12:15:54 crc kubenswrapper[4807]: I1127 12:15:54.983196 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hqqdp" podUID="4b3c67f2-3a77-4bff-9430-722c98d31f95" containerName="registry-server" containerID="cri-o://3016ee4bfa980f81a0e899623b1afdc97fefb9c2795bc9295768370b82d7a0fa" gracePeriod=2 Nov 27 12:15:55 crc kubenswrapper[4807]: I1127 12:15:55.995965 4807 generic.go:334] "Generic (PLEG): container finished" podID="4b3c67f2-3a77-4bff-9430-722c98d31f95" containerID="3016ee4bfa980f81a0e899623b1afdc97fefb9c2795bc9295768370b82d7a0fa" exitCode=0 Nov 27 12:15:55 crc kubenswrapper[4807]: I1127 12:15:55.996012 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqdp" event={"ID":"4b3c67f2-3a77-4bff-9430-722c98d31f95","Type":"ContainerDied","Data":"3016ee4bfa980f81a0e899623b1afdc97fefb9c2795bc9295768370b82d7a0fa"} Nov 27 12:15:56 crc kubenswrapper[4807]: I1127 12:15:56.542980 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:15:56 crc kubenswrapper[4807]: E1127 12:15:56.543465 4807 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:15:56 crc kubenswrapper[4807]: I1127 12:15:56.640821 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:56 crc kubenswrapper[4807]: I1127 12:15:56.738693 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbm5m\" (UniqueName: \"kubernetes.io/projected/4b3c67f2-3a77-4bff-9430-722c98d31f95-kube-api-access-sbm5m\") pod \"4b3c67f2-3a77-4bff-9430-722c98d31f95\" (UID: \"4b3c67f2-3a77-4bff-9430-722c98d31f95\") " Nov 27 12:15:56 crc kubenswrapper[4807]: I1127 12:15:56.738750 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3c67f2-3a77-4bff-9430-722c98d31f95-catalog-content\") pod \"4b3c67f2-3a77-4bff-9430-722c98d31f95\" (UID: \"4b3c67f2-3a77-4bff-9430-722c98d31f95\") " Nov 27 12:15:56 crc kubenswrapper[4807]: I1127 12:15:56.738775 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3c67f2-3a77-4bff-9430-722c98d31f95-utilities\") pod \"4b3c67f2-3a77-4bff-9430-722c98d31f95\" (UID: \"4b3c67f2-3a77-4bff-9430-722c98d31f95\") " Nov 27 12:15:56 crc kubenswrapper[4807]: I1127 12:15:56.740335 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b3c67f2-3a77-4bff-9430-722c98d31f95-utilities" (OuterVolumeSpecName: "utilities") pod "4b3c67f2-3a77-4bff-9430-722c98d31f95" (UID: "4b3c67f2-3a77-4bff-9430-722c98d31f95"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:15:56 crc kubenswrapper[4807]: I1127 12:15:56.762647 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b3c67f2-3a77-4bff-9430-722c98d31f95-kube-api-access-sbm5m" (OuterVolumeSpecName: "kube-api-access-sbm5m") pod "4b3c67f2-3a77-4bff-9430-722c98d31f95" (UID: "4b3c67f2-3a77-4bff-9430-722c98d31f95"). InnerVolumeSpecName "kube-api-access-sbm5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:15:56 crc kubenswrapper[4807]: I1127 12:15:56.836073 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b3c67f2-3a77-4bff-9430-722c98d31f95-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b3c67f2-3a77-4bff-9430-722c98d31f95" (UID: "4b3c67f2-3a77-4bff-9430-722c98d31f95"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:15:56 crc kubenswrapper[4807]: I1127 12:15:56.840316 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbm5m\" (UniqueName: \"kubernetes.io/projected/4b3c67f2-3a77-4bff-9430-722c98d31f95-kube-api-access-sbm5m\") on node \"crc\" DevicePath \"\"" Nov 27 12:15:56 crc kubenswrapper[4807]: I1127 12:15:56.840436 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b3c67f2-3a77-4bff-9430-722c98d31f95-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 12:15:56 crc kubenswrapper[4807]: I1127 12:15:56.840505 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b3c67f2-3a77-4bff-9430-722c98d31f95-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 12:15:57 crc kubenswrapper[4807]: I1127 12:15:57.009455 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hqqdp" 
event={"ID":"4b3c67f2-3a77-4bff-9430-722c98d31f95","Type":"ContainerDied","Data":"46cbc6bd2eec42d2ee894c0e4327ca7f1afe8024e202a6caf3eab895b1ef77c4"} Nov 27 12:15:57 crc kubenswrapper[4807]: I1127 12:15:57.010604 4807 scope.go:117] "RemoveContainer" containerID="3016ee4bfa980f81a0e899623b1afdc97fefb9c2795bc9295768370b82d7a0fa" Nov 27 12:15:57 crc kubenswrapper[4807]: I1127 12:15:57.009393 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hqqdp" Nov 27 12:15:57 crc kubenswrapper[4807]: I1127 12:15:57.047357 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hqqdp"] Nov 27 12:15:57 crc kubenswrapper[4807]: I1127 12:15:57.052701 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hqqdp"] Nov 27 12:15:57 crc kubenswrapper[4807]: I1127 12:15:57.053740 4807 scope.go:117] "RemoveContainer" containerID="f7c5e27fffc57b59044bb73710961bbdb53f20bc6da1f3be8e84f77c44d7b356" Nov 27 12:15:57 crc kubenswrapper[4807]: I1127 12:15:57.087271 4807 scope.go:117] "RemoveContainer" containerID="f89f6c5d6d48c6a18afc79e62c73b32f0913b041eef230fca1f01f0c7d8bb8f7" Nov 27 12:15:57 crc kubenswrapper[4807]: I1127 12:15:57.558854 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b3c67f2-3a77-4bff-9430-722c98d31f95" path="/var/lib/kubelet/pods/4b3c67f2-3a77-4bff-9430-722c98d31f95/volumes" Nov 27 12:16:09 crc kubenswrapper[4807]: I1127 12:16:09.537551 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:16:09 crc kubenswrapper[4807]: E1127 12:16:09.538182 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:16:20 crc kubenswrapper[4807]: I1127 12:16:20.534278 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:16:20 crc kubenswrapper[4807]: E1127 12:16:20.535242 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:16:26 crc kubenswrapper[4807]: I1127 12:16:26.335365 4807 generic.go:334] "Generic (PLEG): container finished" podID="d508b66b-7e3a-4db5-9884-c64fcdd3c1c7" containerID="dde92ac6f8295a88872b3fd94dc63ede50d3a8a70691b30f0d94ed95d4f4de33" exitCode=0 Nov 27 12:16:26 crc kubenswrapper[4807]: I1127 12:16:26.335472 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f58j6/must-gather-6bxh6" event={"ID":"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7","Type":"ContainerDied","Data":"dde92ac6f8295a88872b3fd94dc63ede50d3a8a70691b30f0d94ed95d4f4de33"} Nov 27 12:16:26 crc kubenswrapper[4807]: I1127 12:16:26.336638 4807 scope.go:117] "RemoveContainer" containerID="dde92ac6f8295a88872b3fd94dc63ede50d3a8a70691b30f0d94ed95d4f4de33" Nov 27 12:16:26 crc kubenswrapper[4807]: I1127 12:16:26.782954 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f58j6_must-gather-6bxh6_d508b66b-7e3a-4db5-9884-c64fcdd3c1c7/gather/0.log" Nov 27 12:16:31 crc kubenswrapper[4807]: I1127 12:16:31.532560 4807 scope.go:117] "RemoveContainer" 
containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:16:31 crc kubenswrapper[4807]: E1127 12:16:31.534799 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:16:36 crc kubenswrapper[4807]: I1127 12:16:36.404571 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-f58j6/must-gather-6bxh6"] Nov 27 12:16:36 crc kubenswrapper[4807]: I1127 12:16:36.405131 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-f58j6/must-gather-6bxh6" podUID="d508b66b-7e3a-4db5-9884-c64fcdd3c1c7" containerName="copy" containerID="cri-o://63ce64dc9451dce0f5f4174f9798237bfc6387894dac5dff99ce9aad2c93200f" gracePeriod=2 Nov 27 12:16:36 crc kubenswrapper[4807]: I1127 12:16:36.414062 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-f58j6/must-gather-6bxh6"] Nov 27 12:16:36 crc kubenswrapper[4807]: I1127 12:16:36.849311 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f58j6_must-gather-6bxh6_d508b66b-7e3a-4db5-9884-c64fcdd3c1c7/copy/0.log" Nov 27 12:16:36 crc kubenswrapper[4807]: I1127 12:16:36.850302 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f58j6/must-gather-6bxh6" Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.051564 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d508b66b-7e3a-4db5-9884-c64fcdd3c1c7-must-gather-output\") pod \"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7\" (UID: \"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7\") " Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.051707 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x68h6\" (UniqueName: \"kubernetes.io/projected/d508b66b-7e3a-4db5-9884-c64fcdd3c1c7-kube-api-access-x68h6\") pod \"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7\" (UID: \"d508b66b-7e3a-4db5-9884-c64fcdd3c1c7\") " Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.058443 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d508b66b-7e3a-4db5-9884-c64fcdd3c1c7-kube-api-access-x68h6" (OuterVolumeSpecName: "kube-api-access-x68h6") pod "d508b66b-7e3a-4db5-9884-c64fcdd3c1c7" (UID: "d508b66b-7e3a-4db5-9884-c64fcdd3c1c7"). InnerVolumeSpecName "kube-api-access-x68h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.153334 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x68h6\" (UniqueName: \"kubernetes.io/projected/d508b66b-7e3a-4db5-9884-c64fcdd3c1c7-kube-api-access-x68h6\") on node \"crc\" DevicePath \"\"" Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.178747 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d508b66b-7e3a-4db5-9884-c64fcdd3c1c7-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d508b66b-7e3a-4db5-9884-c64fcdd3c1c7" (UID: "d508b66b-7e3a-4db5-9884-c64fcdd3c1c7"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.254828 4807 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d508b66b-7e3a-4db5-9884-c64fcdd3c1c7-must-gather-output\") on node \"crc\" DevicePath \"\"" Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.445556 4807 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-f58j6_must-gather-6bxh6_d508b66b-7e3a-4db5-9884-c64fcdd3c1c7/copy/0.log" Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.445960 4807 generic.go:334] "Generic (PLEG): container finished" podID="d508b66b-7e3a-4db5-9884-c64fcdd3c1c7" containerID="63ce64dc9451dce0f5f4174f9798237bfc6387894dac5dff99ce9aad2c93200f" exitCode=143 Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.446010 4807 scope.go:117] "RemoveContainer" containerID="63ce64dc9451dce0f5f4174f9798237bfc6387894dac5dff99ce9aad2c93200f" Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.446054 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f58j6/must-gather-6bxh6" Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.464916 4807 scope.go:117] "RemoveContainer" containerID="dde92ac6f8295a88872b3fd94dc63ede50d3a8a70691b30f0d94ed95d4f4de33" Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.544814 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d508b66b-7e3a-4db5-9884-c64fcdd3c1c7" path="/var/lib/kubelet/pods/d508b66b-7e3a-4db5-9884-c64fcdd3c1c7/volumes" Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.546967 4807 scope.go:117] "RemoveContainer" containerID="63ce64dc9451dce0f5f4174f9798237bfc6387894dac5dff99ce9aad2c93200f" Nov 27 12:16:37 crc kubenswrapper[4807]: E1127 12:16:37.547658 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63ce64dc9451dce0f5f4174f9798237bfc6387894dac5dff99ce9aad2c93200f\": container with ID starting with 63ce64dc9451dce0f5f4174f9798237bfc6387894dac5dff99ce9aad2c93200f not found: ID does not exist" containerID="63ce64dc9451dce0f5f4174f9798237bfc6387894dac5dff99ce9aad2c93200f" Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.547702 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63ce64dc9451dce0f5f4174f9798237bfc6387894dac5dff99ce9aad2c93200f"} err="failed to get container status \"63ce64dc9451dce0f5f4174f9798237bfc6387894dac5dff99ce9aad2c93200f\": rpc error: code = NotFound desc = could not find container \"63ce64dc9451dce0f5f4174f9798237bfc6387894dac5dff99ce9aad2c93200f\": container with ID starting with 63ce64dc9451dce0f5f4174f9798237bfc6387894dac5dff99ce9aad2c93200f not found: ID does not exist" Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.547730 4807 scope.go:117] "RemoveContainer" containerID="dde92ac6f8295a88872b3fd94dc63ede50d3a8a70691b30f0d94ed95d4f4de33" Nov 27 12:16:37 crc kubenswrapper[4807]: E1127 12:16:37.548119 4807 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde92ac6f8295a88872b3fd94dc63ede50d3a8a70691b30f0d94ed95d4f4de33\": container with ID starting with dde92ac6f8295a88872b3fd94dc63ede50d3a8a70691b30f0d94ed95d4f4de33 not found: ID does not exist" containerID="dde92ac6f8295a88872b3fd94dc63ede50d3a8a70691b30f0d94ed95d4f4de33" Nov 27 12:16:37 crc kubenswrapper[4807]: I1127 12:16:37.548174 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde92ac6f8295a88872b3fd94dc63ede50d3a8a70691b30f0d94ed95d4f4de33"} err="failed to get container status \"dde92ac6f8295a88872b3fd94dc63ede50d3a8a70691b30f0d94ed95d4f4de33\": rpc error: code = NotFound desc = could not find container \"dde92ac6f8295a88872b3fd94dc63ede50d3a8a70691b30f0d94ed95d4f4de33\": container with ID starting with dde92ac6f8295a88872b3fd94dc63ede50d3a8a70691b30f0d94ed95d4f4de33 not found: ID does not exist" Nov 27 12:16:42 crc kubenswrapper[4807]: I1127 12:16:42.533127 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:16:42 crc kubenswrapper[4807]: E1127 12:16:42.534264 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:16:53 crc kubenswrapper[4807]: I1127 12:16:53.532797 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:16:53 crc kubenswrapper[4807]: E1127 12:16:53.534182 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:17:06 crc kubenswrapper[4807]: I1127 12:17:06.532608 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:17:06 crc kubenswrapper[4807]: E1127 12:17:06.533570 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:17:19 crc kubenswrapper[4807]: I1127 12:17:19.545102 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:17:19 crc kubenswrapper[4807]: E1127 12:17:19.545912 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:17:34 crc kubenswrapper[4807]: I1127 12:17:34.532983 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:17:34 crc kubenswrapper[4807]: E1127 12:17:34.533924 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:17:46 crc kubenswrapper[4807]: I1127 12:17:46.532181 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:17:46 crc kubenswrapper[4807]: E1127 12:17:46.532997 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:18:01 crc kubenswrapper[4807]: I1127 12:18:01.533215 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:18:01 crc kubenswrapper[4807]: E1127 12:18:01.534377 4807 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:18:16 crc kubenswrapper[4807]: I1127 12:18:16.532351 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:18:16 crc kubenswrapper[4807]: E1127 12:18:16.533028 4807 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kk425_openshift-machine-config-operator(aaae6992-39ea-4c99-b5e5-b4c025ec48f7)\"" pod="openshift-machine-config-operator/machine-config-daemon-kk425" podUID="aaae6992-39ea-4c99-b5e5-b4c025ec48f7" Nov 27 12:18:26 crc kubenswrapper[4807]: I1127 12:18:26.957246 4807 scope.go:117] "RemoveContainer" containerID="b7a13278067b3e81b2da1e6dd0301ec24dff504783147b8119ad18e19f223f50" Nov 27 12:18:26 crc kubenswrapper[4807]: I1127 12:18:26.979746 4807 scope.go:117] "RemoveContainer" containerID="49d6767c587a25896be808d45f804b755469c731d56af0db60ae8fdae184ec96" Nov 27 12:18:27 crc kubenswrapper[4807]: I1127 12:18:27.540706 4807 scope.go:117] "RemoveContainer" containerID="e1eb31f7787de45e81dbb1f60a29fa6da255d4b582bd363dc9a64ac03cf26a91" Nov 27 12:18:28 crc kubenswrapper[4807]: I1127 12:18:28.536355 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kk425" event={"ID":"aaae6992-39ea-4c99-b5e5-b4c025ec48f7","Type":"ContainerStarted","Data":"39a844bf819d966b482f717e473c68c67e5939a60058e9cadc6dc0d2a5d83e48"} Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.740418 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zc2lq"] Nov 27 12:18:40 crc kubenswrapper[4807]: E1127 12:18:40.741855 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d508b66b-7e3a-4db5-9884-c64fcdd3c1c7" containerName="copy" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.741880 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d508b66b-7e3a-4db5-9884-c64fcdd3c1c7" containerName="copy" Nov 27 12:18:40 crc kubenswrapper[4807]: E1127 12:18:40.741898 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3c67f2-3a77-4bff-9430-722c98d31f95" 
containerName="extract-content" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.741911 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3c67f2-3a77-4bff-9430-722c98d31f95" containerName="extract-content" Nov 27 12:18:40 crc kubenswrapper[4807]: E1127 12:18:40.741937 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3c67f2-3a77-4bff-9430-722c98d31f95" containerName="extract-utilities" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.741952 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3c67f2-3a77-4bff-9430-722c98d31f95" containerName="extract-utilities" Nov 27 12:18:40 crc kubenswrapper[4807]: E1127 12:18:40.741994 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d508b66b-7e3a-4db5-9884-c64fcdd3c1c7" containerName="gather" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.742006 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="d508b66b-7e3a-4db5-9884-c64fcdd3c1c7" containerName="gather" Nov 27 12:18:40 crc kubenswrapper[4807]: E1127 12:18:40.742039 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3c67f2-3a77-4bff-9430-722c98d31f95" containerName="registry-server" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.742051 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3c67f2-3a77-4bff-9430-722c98d31f95" containerName="registry-server" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.742432 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d508b66b-7e3a-4db5-9884-c64fcdd3c1c7" containerName="copy" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.742461 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="d508b66b-7e3a-4db5-9884-c64fcdd3c1c7" containerName="gather" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.742482 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3c67f2-3a77-4bff-9430-722c98d31f95" containerName="registry-server" Nov 27 
12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.745090 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.754843 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zc2lq"] Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.796308 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb60cf8-77f7-4af6-a101-703070eff72f-catalog-content\") pod \"redhat-marketplace-zc2lq\" (UID: \"ebb60cf8-77f7-4af6-a101-703070eff72f\") " pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.796487 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb60cf8-77f7-4af6-a101-703070eff72f-utilities\") pod \"redhat-marketplace-zc2lq\" (UID: \"ebb60cf8-77f7-4af6-a101-703070eff72f\") " pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.796651 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwwxq\" (UniqueName: \"kubernetes.io/projected/ebb60cf8-77f7-4af6-a101-703070eff72f-kube-api-access-zwwxq\") pod \"redhat-marketplace-zc2lq\" (UID: \"ebb60cf8-77f7-4af6-a101-703070eff72f\") " pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.898371 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb60cf8-77f7-4af6-a101-703070eff72f-catalog-content\") pod \"redhat-marketplace-zc2lq\" (UID: \"ebb60cf8-77f7-4af6-a101-703070eff72f\") " pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 
27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.898483 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb60cf8-77f7-4af6-a101-703070eff72f-utilities\") pod \"redhat-marketplace-zc2lq\" (UID: \"ebb60cf8-77f7-4af6-a101-703070eff72f\") " pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.898559 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwwxq\" (UniqueName: \"kubernetes.io/projected/ebb60cf8-77f7-4af6-a101-703070eff72f-kube-api-access-zwwxq\") pod \"redhat-marketplace-zc2lq\" (UID: \"ebb60cf8-77f7-4af6-a101-703070eff72f\") " pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.898840 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb60cf8-77f7-4af6-a101-703070eff72f-catalog-content\") pod \"redhat-marketplace-zc2lq\" (UID: \"ebb60cf8-77f7-4af6-a101-703070eff72f\") " pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.899299 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb60cf8-77f7-4af6-a101-703070eff72f-utilities\") pod \"redhat-marketplace-zc2lq\" (UID: \"ebb60cf8-77f7-4af6-a101-703070eff72f\") " pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:40 crc kubenswrapper[4807]: I1127 12:18:40.918469 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwwxq\" (UniqueName: \"kubernetes.io/projected/ebb60cf8-77f7-4af6-a101-703070eff72f-kube-api-access-zwwxq\") pod \"redhat-marketplace-zc2lq\" (UID: \"ebb60cf8-77f7-4af6-a101-703070eff72f\") " pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:41 crc kubenswrapper[4807]: I1127 
12:18:41.080239 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:41 crc kubenswrapper[4807]: I1127 12:18:41.550548 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zc2lq"] Nov 27 12:18:41 crc kubenswrapper[4807]: I1127 12:18:41.717663 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zc2lq" event={"ID":"ebb60cf8-77f7-4af6-a101-703070eff72f","Type":"ContainerStarted","Data":"e724edbe930e2d8a39b10c66a765f2500746d55f8c5662681d15aa0df400911d"} Nov 27 12:18:42 crc kubenswrapper[4807]: I1127 12:18:42.727648 4807 generic.go:334] "Generic (PLEG): container finished" podID="ebb60cf8-77f7-4af6-a101-703070eff72f" containerID="6c492cd1b7b649a3629d302208771a0d74227f1e69d4ba6889c4faf22734adc7" exitCode=0 Nov 27 12:18:42 crc kubenswrapper[4807]: I1127 12:18:42.727736 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zc2lq" event={"ID":"ebb60cf8-77f7-4af6-a101-703070eff72f","Type":"ContainerDied","Data":"6c492cd1b7b649a3629d302208771a0d74227f1e69d4ba6889c4faf22734adc7"} Nov 27 12:18:43 crc kubenswrapper[4807]: I1127 12:18:43.740714 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zc2lq" event={"ID":"ebb60cf8-77f7-4af6-a101-703070eff72f","Type":"ContainerStarted","Data":"723d1c8ea1c06587976b25f0bdc5a4f87395fecbdf700dd02f68911943892e32"} Nov 27 12:18:44 crc kubenswrapper[4807]: I1127 12:18:44.752728 4807 generic.go:334] "Generic (PLEG): container finished" podID="ebb60cf8-77f7-4af6-a101-703070eff72f" containerID="723d1c8ea1c06587976b25f0bdc5a4f87395fecbdf700dd02f68911943892e32" exitCode=0 Nov 27 12:18:44 crc kubenswrapper[4807]: I1127 12:18:44.752835 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zc2lq" 
event={"ID":"ebb60cf8-77f7-4af6-a101-703070eff72f","Type":"ContainerDied","Data":"723d1c8ea1c06587976b25f0bdc5a4f87395fecbdf700dd02f68911943892e32"} Nov 27 12:18:44 crc kubenswrapper[4807]: I1127 12:18:44.753075 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zc2lq" event={"ID":"ebb60cf8-77f7-4af6-a101-703070eff72f","Type":"ContainerStarted","Data":"d34d6fd0128f9cc520124bf9d619ca13256095bb63b89c697e1400c7d9354680"} Nov 27 12:18:44 crc kubenswrapper[4807]: I1127 12:18:44.775156 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zc2lq" podStartSLOduration=3.21909396 podStartE2EDuration="4.775128128s" podCreationTimestamp="2025-11-27 12:18:40 +0000 UTC" firstStartedPulling="2025-11-27 12:18:42.729477786 +0000 UTC m=+4163.828975994" lastFinishedPulling="2025-11-27 12:18:44.285511954 +0000 UTC m=+4165.385010162" observedRunningTime="2025-11-27 12:18:44.772665973 +0000 UTC m=+4165.872164181" watchObservedRunningTime="2025-11-27 12:18:44.775128128 +0000 UTC m=+4165.874626356" Nov 27 12:18:51 crc kubenswrapper[4807]: I1127 12:18:51.080403 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:51 crc kubenswrapper[4807]: I1127 12:18:51.082659 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:51 crc kubenswrapper[4807]: I1127 12:18:51.133020 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:51 crc kubenswrapper[4807]: I1127 12:18:51.869831 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:54 crc kubenswrapper[4807]: I1127 12:18:54.594483 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-zc2lq"] Nov 27 12:18:54 crc kubenswrapper[4807]: I1127 12:18:54.840020 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zc2lq" podUID="ebb60cf8-77f7-4af6-a101-703070eff72f" containerName="registry-server" containerID="cri-o://d34d6fd0128f9cc520124bf9d619ca13256095bb63b89c697e1400c7d9354680" gracePeriod=2 Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.278787 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.380024 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb60cf8-77f7-4af6-a101-703070eff72f-utilities\") pod \"ebb60cf8-77f7-4af6-a101-703070eff72f\" (UID: \"ebb60cf8-77f7-4af6-a101-703070eff72f\") " Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.380167 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb60cf8-77f7-4af6-a101-703070eff72f-catalog-content\") pod \"ebb60cf8-77f7-4af6-a101-703070eff72f\" (UID: \"ebb60cf8-77f7-4af6-a101-703070eff72f\") " Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.380365 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwwxq\" (UniqueName: \"kubernetes.io/projected/ebb60cf8-77f7-4af6-a101-703070eff72f-kube-api-access-zwwxq\") pod \"ebb60cf8-77f7-4af6-a101-703070eff72f\" (UID: \"ebb60cf8-77f7-4af6-a101-703070eff72f\") " Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.381021 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb60cf8-77f7-4af6-a101-703070eff72f-utilities" (OuterVolumeSpecName: "utilities") pod "ebb60cf8-77f7-4af6-a101-703070eff72f" (UID: 
"ebb60cf8-77f7-4af6-a101-703070eff72f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.386123 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebb60cf8-77f7-4af6-a101-703070eff72f-kube-api-access-zwwxq" (OuterVolumeSpecName: "kube-api-access-zwwxq") pod "ebb60cf8-77f7-4af6-a101-703070eff72f" (UID: "ebb60cf8-77f7-4af6-a101-703070eff72f"). InnerVolumeSpecName "kube-api-access-zwwxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.410392 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb60cf8-77f7-4af6-a101-703070eff72f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebb60cf8-77f7-4af6-a101-703070eff72f" (UID: "ebb60cf8-77f7-4af6-a101-703070eff72f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.483355 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb60cf8-77f7-4af6-a101-703070eff72f-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.483642 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb60cf8-77f7-4af6-a101-703070eff72f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.483822 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwwxq\" (UniqueName: \"kubernetes.io/projected/ebb60cf8-77f7-4af6-a101-703070eff72f-kube-api-access-zwwxq\") on node \"crc\" DevicePath \"\"" Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.864408 4807 generic.go:334] "Generic (PLEG): container finished" 
podID="ebb60cf8-77f7-4af6-a101-703070eff72f" containerID="d34d6fd0128f9cc520124bf9d619ca13256095bb63b89c697e1400c7d9354680" exitCode=0 Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.864453 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zc2lq" Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.864472 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zc2lq" event={"ID":"ebb60cf8-77f7-4af6-a101-703070eff72f","Type":"ContainerDied","Data":"d34d6fd0128f9cc520124bf9d619ca13256095bb63b89c697e1400c7d9354680"} Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.866859 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zc2lq" event={"ID":"ebb60cf8-77f7-4af6-a101-703070eff72f","Type":"ContainerDied","Data":"e724edbe930e2d8a39b10c66a765f2500746d55f8c5662681d15aa0df400911d"} Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.866877 4807 scope.go:117] "RemoveContainer" containerID="d34d6fd0128f9cc520124bf9d619ca13256095bb63b89c697e1400c7d9354680" Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.910473 4807 scope.go:117] "RemoveContainer" containerID="723d1c8ea1c06587976b25f0bdc5a4f87395fecbdf700dd02f68911943892e32" Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.914109 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zc2lq"] Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.930915 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zc2lq"] Nov 27 12:18:55 crc kubenswrapper[4807]: I1127 12:18:55.967585 4807 scope.go:117] "RemoveContainer" containerID="6c492cd1b7b649a3629d302208771a0d74227f1e69d4ba6889c4faf22734adc7" Nov 27 12:18:56 crc kubenswrapper[4807]: I1127 12:18:56.002209 4807 scope.go:117] "RemoveContainer" 
containerID="d34d6fd0128f9cc520124bf9d619ca13256095bb63b89c697e1400c7d9354680" Nov 27 12:18:56 crc kubenswrapper[4807]: E1127 12:18:56.002737 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d34d6fd0128f9cc520124bf9d619ca13256095bb63b89c697e1400c7d9354680\": container with ID starting with d34d6fd0128f9cc520124bf9d619ca13256095bb63b89c697e1400c7d9354680 not found: ID does not exist" containerID="d34d6fd0128f9cc520124bf9d619ca13256095bb63b89c697e1400c7d9354680" Nov 27 12:18:56 crc kubenswrapper[4807]: I1127 12:18:56.002787 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d34d6fd0128f9cc520124bf9d619ca13256095bb63b89c697e1400c7d9354680"} err="failed to get container status \"d34d6fd0128f9cc520124bf9d619ca13256095bb63b89c697e1400c7d9354680\": rpc error: code = NotFound desc = could not find container \"d34d6fd0128f9cc520124bf9d619ca13256095bb63b89c697e1400c7d9354680\": container with ID starting with d34d6fd0128f9cc520124bf9d619ca13256095bb63b89c697e1400c7d9354680 not found: ID does not exist" Nov 27 12:18:56 crc kubenswrapper[4807]: I1127 12:18:56.002820 4807 scope.go:117] "RemoveContainer" containerID="723d1c8ea1c06587976b25f0bdc5a4f87395fecbdf700dd02f68911943892e32" Nov 27 12:18:56 crc kubenswrapper[4807]: E1127 12:18:56.003290 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"723d1c8ea1c06587976b25f0bdc5a4f87395fecbdf700dd02f68911943892e32\": container with ID starting with 723d1c8ea1c06587976b25f0bdc5a4f87395fecbdf700dd02f68911943892e32 not found: ID does not exist" containerID="723d1c8ea1c06587976b25f0bdc5a4f87395fecbdf700dd02f68911943892e32" Nov 27 12:18:56 crc kubenswrapper[4807]: I1127 12:18:56.003335 4807 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"723d1c8ea1c06587976b25f0bdc5a4f87395fecbdf700dd02f68911943892e32"} err="failed to get container status \"723d1c8ea1c06587976b25f0bdc5a4f87395fecbdf700dd02f68911943892e32\": rpc error: code = NotFound desc = could not find container \"723d1c8ea1c06587976b25f0bdc5a4f87395fecbdf700dd02f68911943892e32\": container with ID starting with 723d1c8ea1c06587976b25f0bdc5a4f87395fecbdf700dd02f68911943892e32 not found: ID does not exist" Nov 27 12:18:56 crc kubenswrapper[4807]: I1127 12:18:56.003366 4807 scope.go:117] "RemoveContainer" containerID="6c492cd1b7b649a3629d302208771a0d74227f1e69d4ba6889c4faf22734adc7" Nov 27 12:18:56 crc kubenswrapper[4807]: E1127 12:18:56.003698 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c492cd1b7b649a3629d302208771a0d74227f1e69d4ba6889c4faf22734adc7\": container with ID starting with 6c492cd1b7b649a3629d302208771a0d74227f1e69d4ba6889c4faf22734adc7 not found: ID does not exist" containerID="6c492cd1b7b649a3629d302208771a0d74227f1e69d4ba6889c4faf22734adc7" Nov 27 12:18:56 crc kubenswrapper[4807]: I1127 12:18:56.003739 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c492cd1b7b649a3629d302208771a0d74227f1e69d4ba6889c4faf22734adc7"} err="failed to get container status \"6c492cd1b7b649a3629d302208771a0d74227f1e69d4ba6889c4faf22734adc7\": rpc error: code = NotFound desc = could not find container \"6c492cd1b7b649a3629d302208771a0d74227f1e69d4ba6889c4faf22734adc7\": container with ID starting with 6c492cd1b7b649a3629d302208771a0d74227f1e69d4ba6889c4faf22734adc7 not found: ID does not exist" Nov 27 12:18:57 crc kubenswrapper[4807]: I1127 12:18:57.549832 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebb60cf8-77f7-4af6-a101-703070eff72f" path="/var/lib/kubelet/pods/ebb60cf8-77f7-4af6-a101-703070eff72f/volumes" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 
12:18:59.395152 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d7m75"] Nov 27 12:18:59 crc kubenswrapper[4807]: E1127 12:18:59.395772 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb60cf8-77f7-4af6-a101-703070eff72f" containerName="registry-server" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.395785 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb60cf8-77f7-4af6-a101-703070eff72f" containerName="registry-server" Nov 27 12:18:59 crc kubenswrapper[4807]: E1127 12:18:59.395806 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb60cf8-77f7-4af6-a101-703070eff72f" containerName="extract-utilities" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.395812 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb60cf8-77f7-4af6-a101-703070eff72f" containerName="extract-utilities" Nov 27 12:18:59 crc kubenswrapper[4807]: E1127 12:18:59.395840 4807 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb60cf8-77f7-4af6-a101-703070eff72f" containerName="extract-content" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.395847 4807 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb60cf8-77f7-4af6-a101-703070eff72f" containerName="extract-content" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.396031 4807 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebb60cf8-77f7-4af6-a101-703070eff72f" containerName="registry-server" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.397424 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.418831 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7m75"] Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.576345 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-utilities\") pod \"certified-operators-d7m75\" (UID: \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\") " pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.576414 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9ckb\" (UniqueName: \"kubernetes.io/projected/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-kube-api-access-z9ckb\") pod \"certified-operators-d7m75\" (UID: \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\") " pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.576497 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-catalog-content\") pod \"certified-operators-d7m75\" (UID: \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\") " pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.678155 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-utilities\") pod \"certified-operators-d7m75\" (UID: \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\") " pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.678293 4807 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-z9ckb\" (UniqueName: \"kubernetes.io/projected/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-kube-api-access-z9ckb\") pod \"certified-operators-d7m75\" (UID: \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\") " pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.678449 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-catalog-content\") pod \"certified-operators-d7m75\" (UID: \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\") " pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.678710 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-utilities\") pod \"certified-operators-d7m75\" (UID: \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\") " pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.679020 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-catalog-content\") pod \"certified-operators-d7m75\" (UID: \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\") " pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:18:59 crc kubenswrapper[4807]: I1127 12:18:59.924698 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9ckb\" (UniqueName: \"kubernetes.io/projected/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-kube-api-access-z9ckb\") pod \"certified-operators-d7m75\" (UID: \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\") " pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.034329 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.479562 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d7m75"] Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.599406 4807 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ll275"] Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.601802 4807 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.610476 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ll275"] Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.701164 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2ttx\" (UniqueName: \"kubernetes.io/projected/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-kube-api-access-x2ttx\") pod \"community-operators-ll275\" (UID: \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\") " pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.701321 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-utilities\") pod \"community-operators-ll275\" (UID: \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\") " pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.701371 4807 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-catalog-content\") pod \"community-operators-ll275\" (UID: \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\") " 
pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.802818 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-utilities\") pod \"community-operators-ll275\" (UID: \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\") " pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.803085 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-catalog-content\") pod \"community-operators-ll275\" (UID: \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\") " pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.803200 4807 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2ttx\" (UniqueName: \"kubernetes.io/projected/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-kube-api-access-x2ttx\") pod \"community-operators-ll275\" (UID: \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\") " pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.803398 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-utilities\") pod \"community-operators-ll275\" (UID: \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\") " pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.803667 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-catalog-content\") pod \"community-operators-ll275\" (UID: \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\") " 
pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.822543 4807 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2ttx\" (UniqueName: \"kubernetes.io/projected/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-kube-api-access-x2ttx\") pod \"community-operators-ll275\" (UID: \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\") " pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.926129 4807 generic.go:334] "Generic (PLEG): container finished" podID="9d25dcd8-31ee-4f09-86c6-1e170c1909b4" containerID="d619ab048069a49b3e39c402d2f1ca831adecaf7982513b5168edbf500a90d98" exitCode=0 Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.926193 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7m75" event={"ID":"9d25dcd8-31ee-4f09-86c6-1e170c1909b4","Type":"ContainerDied","Data":"d619ab048069a49b3e39c402d2f1ca831adecaf7982513b5168edbf500a90d98"} Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.926228 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7m75" event={"ID":"9d25dcd8-31ee-4f09-86c6-1e170c1909b4","Type":"ContainerStarted","Data":"a1799a3cab3f8eae647ad6814e4d1e6cb3282061eff0e88c11e481038c1f9bee"} Nov 27 12:19:00 crc kubenswrapper[4807]: I1127 12:19:00.938138 4807 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:01 crc kubenswrapper[4807]: I1127 12:19:01.467990 4807 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ll275"] Nov 27 12:19:01 crc kubenswrapper[4807]: I1127 12:19:01.938460 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7m75" event={"ID":"9d25dcd8-31ee-4f09-86c6-1e170c1909b4","Type":"ContainerStarted","Data":"657860758dcd7d3e51517f4032e28cc38080355e3427b59413fffe80b7071eb0"} Nov 27 12:19:01 crc kubenswrapper[4807]: I1127 12:19:01.940717 4807 generic.go:334] "Generic (PLEG): container finished" podID="b0c1364f-ec1e-4ff0-a98a-9b9f279f0707" containerID="88788154fed3cddc03d65bb9419865ffd5cad6157af42f88cb0ef71caf6ceb33" exitCode=0 Nov 27 12:19:01 crc kubenswrapper[4807]: I1127 12:19:01.940779 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ll275" event={"ID":"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707","Type":"ContainerDied","Data":"88788154fed3cddc03d65bb9419865ffd5cad6157af42f88cb0ef71caf6ceb33"} Nov 27 12:19:01 crc kubenswrapper[4807]: I1127 12:19:01.940812 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ll275" event={"ID":"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707","Type":"ContainerStarted","Data":"7f9a964baa68faffa9cff13dcd36be89c45088d36492fd0756c2d2e1c32b9a21"} Nov 27 12:19:02 crc kubenswrapper[4807]: I1127 12:19:02.951097 4807 generic.go:334] "Generic (PLEG): container finished" podID="9d25dcd8-31ee-4f09-86c6-1e170c1909b4" containerID="657860758dcd7d3e51517f4032e28cc38080355e3427b59413fffe80b7071eb0" exitCode=0 Nov 27 12:19:02 crc kubenswrapper[4807]: I1127 12:19:02.951169 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7m75" 
event={"ID":"9d25dcd8-31ee-4f09-86c6-1e170c1909b4","Type":"ContainerDied","Data":"657860758dcd7d3e51517f4032e28cc38080355e3427b59413fffe80b7071eb0"} Nov 27 12:19:03 crc kubenswrapper[4807]: I1127 12:19:03.961888 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7m75" event={"ID":"9d25dcd8-31ee-4f09-86c6-1e170c1909b4","Type":"ContainerStarted","Data":"d83d4b6f747ed8a460eb76d54150d309337a7837f0ce8acd017cc0e586738142"} Nov 27 12:19:03 crc kubenswrapper[4807]: I1127 12:19:03.963981 4807 generic.go:334] "Generic (PLEG): container finished" podID="b0c1364f-ec1e-4ff0-a98a-9b9f279f0707" containerID="fd2b23a43e15da121dcfd28ca8e7dda2b730b20231a83a8400db8c6fc4e76adb" exitCode=0 Nov 27 12:19:03 crc kubenswrapper[4807]: I1127 12:19:03.964061 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ll275" event={"ID":"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707","Type":"ContainerDied","Data":"fd2b23a43e15da121dcfd28ca8e7dda2b730b20231a83a8400db8c6fc4e76adb"} Nov 27 12:19:03 crc kubenswrapper[4807]: I1127 12:19:03.989694 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d7m75" podStartSLOduration=2.408769475 podStartE2EDuration="4.989673899s" podCreationTimestamp="2025-11-27 12:18:59 +0000 UTC" firstStartedPulling="2025-11-27 12:19:00.933967484 +0000 UTC m=+4182.033465682" lastFinishedPulling="2025-11-27 12:19:03.514871918 +0000 UTC m=+4184.614370106" observedRunningTime="2025-11-27 12:19:03.981739987 +0000 UTC m=+4185.081238185" watchObservedRunningTime="2025-11-27 12:19:03.989673899 +0000 UTC m=+4185.089172097" Nov 27 12:19:04 crc kubenswrapper[4807]: I1127 12:19:04.974488 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ll275" 
event={"ID":"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707","Type":"ContainerStarted","Data":"72e065f56a2f2da72c16558c66d4877683b53f0c6fe03049722b8adfa950c6a7"} Nov 27 12:19:04 crc kubenswrapper[4807]: I1127 12:19:04.995424 4807 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ll275" podStartSLOduration=2.484380273 podStartE2EDuration="4.995408615s" podCreationTimestamp="2025-11-27 12:19:00 +0000 UTC" firstStartedPulling="2025-11-27 12:19:01.943232295 +0000 UTC m=+4183.042730503" lastFinishedPulling="2025-11-27 12:19:04.454260647 +0000 UTC m=+4185.553758845" observedRunningTime="2025-11-27 12:19:04.987761691 +0000 UTC m=+4186.087259889" watchObservedRunningTime="2025-11-27 12:19:04.995408615 +0000 UTC m=+4186.094906803" Nov 27 12:19:10 crc kubenswrapper[4807]: I1127 12:19:10.035437 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:19:10 crc kubenswrapper[4807]: I1127 12:19:10.035810 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:19:10 crc kubenswrapper[4807]: I1127 12:19:10.581270 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:19:10 crc kubenswrapper[4807]: I1127 12:19:10.938467 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:10 crc kubenswrapper[4807]: I1127 12:19:10.938511 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:10 crc kubenswrapper[4807]: I1127 12:19:10.982842 4807 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:11 crc kubenswrapper[4807]: I1127 12:19:11.093921 4807 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:11 crc kubenswrapper[4807]: I1127 12:19:11.100041 4807 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:19:13 crc kubenswrapper[4807]: I1127 12:19:13.988718 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7m75"] Nov 27 12:19:13 crc kubenswrapper[4807]: I1127 12:19:13.989280 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d7m75" podUID="9d25dcd8-31ee-4f09-86c6-1e170c1909b4" containerName="registry-server" containerID="cri-o://d83d4b6f747ed8a460eb76d54150d309337a7837f0ce8acd017cc0e586738142" gracePeriod=2 Nov 27 12:19:14 crc kubenswrapper[4807]: I1127 12:19:14.557589 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:19:14 crc kubenswrapper[4807]: I1127 12:19:14.714432 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-utilities\") pod \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\" (UID: \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\") " Nov 27 12:19:14 crc kubenswrapper[4807]: I1127 12:19:14.714602 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-catalog-content\") pod \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\" (UID: \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\") " Nov 27 12:19:14 crc kubenswrapper[4807]: I1127 12:19:14.714731 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9ckb\" (UniqueName: 
\"kubernetes.io/projected/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-kube-api-access-z9ckb\") pod \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\" (UID: \"9d25dcd8-31ee-4f09-86c6-1e170c1909b4\") " Nov 27 12:19:14 crc kubenswrapper[4807]: I1127 12:19:14.715454 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-utilities" (OuterVolumeSpecName: "utilities") pod "9d25dcd8-31ee-4f09-86c6-1e170c1909b4" (UID: "9d25dcd8-31ee-4f09-86c6-1e170c1909b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:19:14 crc kubenswrapper[4807]: I1127 12:19:14.723709 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-kube-api-access-z9ckb" (OuterVolumeSpecName: "kube-api-access-z9ckb") pod "9d25dcd8-31ee-4f09-86c6-1e170c1909b4" (UID: "9d25dcd8-31ee-4f09-86c6-1e170c1909b4"). InnerVolumeSpecName "kube-api-access-z9ckb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 27 12:19:14 crc kubenswrapper[4807]: I1127 12:19:14.786655 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d25dcd8-31ee-4f09-86c6-1e170c1909b4" (UID: "9d25dcd8-31ee-4f09-86c6-1e170c1909b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:19:14 crc kubenswrapper[4807]: I1127 12:19:14.817004 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-utilities\") on node \"crc\" DevicePath \"\"" Nov 27 12:19:14 crc kubenswrapper[4807]: I1127 12:19:14.817044 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 27 12:19:14 crc kubenswrapper[4807]: I1127 12:19:14.817058 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9ckb\" (UniqueName: \"kubernetes.io/projected/9d25dcd8-31ee-4f09-86c6-1e170c1909b4-kube-api-access-z9ckb\") on node \"crc\" DevicePath \"\"" Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.082706 4807 generic.go:334] "Generic (PLEG): container finished" podID="9d25dcd8-31ee-4f09-86c6-1e170c1909b4" containerID="d83d4b6f747ed8a460eb76d54150d309337a7837f0ce8acd017cc0e586738142" exitCode=0 Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.082779 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7m75" event={"ID":"9d25dcd8-31ee-4f09-86c6-1e170c1909b4","Type":"ContainerDied","Data":"d83d4b6f747ed8a460eb76d54150d309337a7837f0ce8acd017cc0e586738142"} Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.082818 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d7m75" Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.082862 4807 scope.go:117] "RemoveContainer" containerID="d83d4b6f747ed8a460eb76d54150d309337a7837f0ce8acd017cc0e586738142" Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.082831 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d7m75" event={"ID":"9d25dcd8-31ee-4f09-86c6-1e170c1909b4","Type":"ContainerDied","Data":"a1799a3cab3f8eae647ad6814e4d1e6cb3282061eff0e88c11e481038c1f9bee"} Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.117083 4807 scope.go:117] "RemoveContainer" containerID="657860758dcd7d3e51517f4032e28cc38080355e3427b59413fffe80b7071eb0" Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.146495 4807 scope.go:117] "RemoveContainer" containerID="d619ab048069a49b3e39c402d2f1ca831adecaf7982513b5168edbf500a90d98" Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.151421 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d7m75"] Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.162759 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d7m75"] Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.200846 4807 scope.go:117] "RemoveContainer" containerID="d83d4b6f747ed8a460eb76d54150d309337a7837f0ce8acd017cc0e586738142" Nov 27 12:19:15 crc kubenswrapper[4807]: E1127 12:19:15.201280 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d83d4b6f747ed8a460eb76d54150d309337a7837f0ce8acd017cc0e586738142\": container with ID starting with d83d4b6f747ed8a460eb76d54150d309337a7837f0ce8acd017cc0e586738142 not found: ID does not exist" containerID="d83d4b6f747ed8a460eb76d54150d309337a7837f0ce8acd017cc0e586738142" Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.201331 4807 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d83d4b6f747ed8a460eb76d54150d309337a7837f0ce8acd017cc0e586738142"} err="failed to get container status \"d83d4b6f747ed8a460eb76d54150d309337a7837f0ce8acd017cc0e586738142\": rpc error: code = NotFound desc = could not find container \"d83d4b6f747ed8a460eb76d54150d309337a7837f0ce8acd017cc0e586738142\": container with ID starting with d83d4b6f747ed8a460eb76d54150d309337a7837f0ce8acd017cc0e586738142 not found: ID does not exist" Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.201350 4807 scope.go:117] "RemoveContainer" containerID="657860758dcd7d3e51517f4032e28cc38080355e3427b59413fffe80b7071eb0" Nov 27 12:19:15 crc kubenswrapper[4807]: E1127 12:19:15.201754 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"657860758dcd7d3e51517f4032e28cc38080355e3427b59413fffe80b7071eb0\": container with ID starting with 657860758dcd7d3e51517f4032e28cc38080355e3427b59413fffe80b7071eb0 not found: ID does not exist" containerID="657860758dcd7d3e51517f4032e28cc38080355e3427b59413fffe80b7071eb0" Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.201782 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"657860758dcd7d3e51517f4032e28cc38080355e3427b59413fffe80b7071eb0"} err="failed to get container status \"657860758dcd7d3e51517f4032e28cc38080355e3427b59413fffe80b7071eb0\": rpc error: code = NotFound desc = could not find container \"657860758dcd7d3e51517f4032e28cc38080355e3427b59413fffe80b7071eb0\": container with ID starting with 657860758dcd7d3e51517f4032e28cc38080355e3427b59413fffe80b7071eb0 not found: ID does not exist" Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.201826 4807 scope.go:117] "RemoveContainer" containerID="d619ab048069a49b3e39c402d2f1ca831adecaf7982513b5168edbf500a90d98" Nov 27 12:19:15 crc kubenswrapper[4807]: E1127 
12:19:15.202174 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d619ab048069a49b3e39c402d2f1ca831adecaf7982513b5168edbf500a90d98\": container with ID starting with d619ab048069a49b3e39c402d2f1ca831adecaf7982513b5168edbf500a90d98 not found: ID does not exist" containerID="d619ab048069a49b3e39c402d2f1ca831adecaf7982513b5168edbf500a90d98" Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.202196 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d619ab048069a49b3e39c402d2f1ca831adecaf7982513b5168edbf500a90d98"} err="failed to get container status \"d619ab048069a49b3e39c402d2f1ca831adecaf7982513b5168edbf500a90d98\": rpc error: code = NotFound desc = could not find container \"d619ab048069a49b3e39c402d2f1ca831adecaf7982513b5168edbf500a90d98\": container with ID starting with d619ab048069a49b3e39c402d2f1ca831adecaf7982513b5168edbf500a90d98 not found: ID does not exist" Nov 27 12:19:15 crc kubenswrapper[4807]: I1127 12:19:15.548737 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d25dcd8-31ee-4f09-86c6-1e170c1909b4" path="/var/lib/kubelet/pods/9d25dcd8-31ee-4f09-86c6-1e170c1909b4/volumes" Nov 27 12:19:16 crc kubenswrapper[4807]: I1127 12:19:16.587753 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ll275"] Nov 27 12:19:16 crc kubenswrapper[4807]: I1127 12:19:16.588305 4807 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ll275" podUID="b0c1364f-ec1e-4ff0-a98a-9b9f279f0707" containerName="registry-server" containerID="cri-o://72e065f56a2f2da72c16558c66d4877683b53f0c6fe03049722b8adfa950c6a7" gracePeriod=2 Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.019892 4807 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ll275" Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.059705 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-catalog-content\") pod \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\" (UID: \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\") " Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.059850 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-utilities\") pod \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\" (UID: \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\") " Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.059917 4807 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2ttx\" (UniqueName: \"kubernetes.io/projected/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-kube-api-access-x2ttx\") pod \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\" (UID: \"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707\") " Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.061355 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-utilities" (OuterVolumeSpecName: "utilities") pod "b0c1364f-ec1e-4ff0-a98a-9b9f279f0707" (UID: "b0c1364f-ec1e-4ff0-a98a-9b9f279f0707"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.065209 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-kube-api-access-x2ttx" (OuterVolumeSpecName: "kube-api-access-x2ttx") pod "b0c1364f-ec1e-4ff0-a98a-9b9f279f0707" (UID: "b0c1364f-ec1e-4ff0-a98a-9b9f279f0707"). InnerVolumeSpecName "kube-api-access-x2ttx". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.110178 4807 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0c1364f-ec1e-4ff0-a98a-9b9f279f0707" (UID: "b0c1364f-ec1e-4ff0-a98a-9b9f279f0707"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.117033 4807 generic.go:334] "Generic (PLEG): container finished" podID="b0c1364f-ec1e-4ff0-a98a-9b9f279f0707" containerID="72e065f56a2f2da72c16558c66d4877683b53f0c6fe03049722b8adfa950c6a7" exitCode=0
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.117095 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ll275" event={"ID":"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707","Type":"ContainerDied","Data":"72e065f56a2f2da72c16558c66d4877683b53f0c6fe03049722b8adfa950c6a7"}
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.117103 4807 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ll275"
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.117123 4807 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ll275" event={"ID":"b0c1364f-ec1e-4ff0-a98a-9b9f279f0707","Type":"ContainerDied","Data":"7f9a964baa68faffa9cff13dcd36be89c45088d36492fd0756c2d2e1c32b9a21"}
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.117158 4807 scope.go:117] "RemoveContainer" containerID="72e065f56a2f2da72c16558c66d4877683b53f0c6fe03049722b8adfa950c6a7"
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.141371 4807 scope.go:117] "RemoveContainer" containerID="fd2b23a43e15da121dcfd28ca8e7dda2b730b20231a83a8400db8c6fc4e76adb"
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.149807 4807 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ll275"]
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.159204 4807 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ll275"]
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.162636 4807 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-utilities\") on node \"crc\" DevicePath \"\""
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.162660 4807 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2ttx\" (UniqueName: \"kubernetes.io/projected/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-kube-api-access-x2ttx\") on node \"crc\" DevicePath \"\""
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.162669 4807 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.177878 4807 scope.go:117] "RemoveContainer" containerID="88788154fed3cddc03d65bb9419865ffd5cad6157af42f88cb0ef71caf6ceb33"
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.205748 4807 scope.go:117] "RemoveContainer" containerID="72e065f56a2f2da72c16558c66d4877683b53f0c6fe03049722b8adfa950c6a7"
Nov 27 12:19:17 crc kubenswrapper[4807]: E1127 12:19:17.206192 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e065f56a2f2da72c16558c66d4877683b53f0c6fe03049722b8adfa950c6a7\": container with ID starting with 72e065f56a2f2da72c16558c66d4877683b53f0c6fe03049722b8adfa950c6a7 not found: ID does not exist" containerID="72e065f56a2f2da72c16558c66d4877683b53f0c6fe03049722b8adfa950c6a7"
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.206241 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e065f56a2f2da72c16558c66d4877683b53f0c6fe03049722b8adfa950c6a7"} err="failed to get container status \"72e065f56a2f2da72c16558c66d4877683b53f0c6fe03049722b8adfa950c6a7\": rpc error: code = NotFound desc = could not find container \"72e065f56a2f2da72c16558c66d4877683b53f0c6fe03049722b8adfa950c6a7\": container with ID starting with 72e065f56a2f2da72c16558c66d4877683b53f0c6fe03049722b8adfa950c6a7 not found: ID does not exist"
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.206301 4807 scope.go:117] "RemoveContainer" containerID="fd2b23a43e15da121dcfd28ca8e7dda2b730b20231a83a8400db8c6fc4e76adb"
Nov 27 12:19:17 crc kubenswrapper[4807]: E1127 12:19:17.206639 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd2b23a43e15da121dcfd28ca8e7dda2b730b20231a83a8400db8c6fc4e76adb\": container with ID starting with fd2b23a43e15da121dcfd28ca8e7dda2b730b20231a83a8400db8c6fc4e76adb not found: ID does not exist" containerID="fd2b23a43e15da121dcfd28ca8e7dda2b730b20231a83a8400db8c6fc4e76adb"
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.206676 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2b23a43e15da121dcfd28ca8e7dda2b730b20231a83a8400db8c6fc4e76adb"} err="failed to get container status \"fd2b23a43e15da121dcfd28ca8e7dda2b730b20231a83a8400db8c6fc4e76adb\": rpc error: code = NotFound desc = could not find container \"fd2b23a43e15da121dcfd28ca8e7dda2b730b20231a83a8400db8c6fc4e76adb\": container with ID starting with fd2b23a43e15da121dcfd28ca8e7dda2b730b20231a83a8400db8c6fc4e76adb not found: ID does not exist"
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.206706 4807 scope.go:117] "RemoveContainer" containerID="88788154fed3cddc03d65bb9419865ffd5cad6157af42f88cb0ef71caf6ceb33"
Nov 27 12:19:17 crc kubenswrapper[4807]: E1127 12:19:17.207031 4807 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88788154fed3cddc03d65bb9419865ffd5cad6157af42f88cb0ef71caf6ceb33\": container with ID starting with 88788154fed3cddc03d65bb9419865ffd5cad6157af42f88cb0ef71caf6ceb33 not found: ID does not exist" containerID="88788154fed3cddc03d65bb9419865ffd5cad6157af42f88cb0ef71caf6ceb33"
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.207074 4807 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88788154fed3cddc03d65bb9419865ffd5cad6157af42f88cb0ef71caf6ceb33"} err="failed to get container status \"88788154fed3cddc03d65bb9419865ffd5cad6157af42f88cb0ef71caf6ceb33\": rpc error: code = NotFound desc = could not find container \"88788154fed3cddc03d65bb9419865ffd5cad6157af42f88cb0ef71caf6ceb33\": container with ID starting with 88788154fed3cddc03d65bb9419865ffd5cad6157af42f88cb0ef71caf6ceb33 not found: ID does not exist"
Nov 27 12:19:17 crc kubenswrapper[4807]: I1127 12:19:17.543960 4807 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0c1364f-ec1e-4ff0-a98a-9b9f279f0707" path="/var/lib/kubelet/pods/b0c1364f-ec1e-4ff0-a98a-9b9f279f0707/volumes"